Artificial intelligence

The thing is, there is some genuinely staggering AI out there, but it ain’t in regtech — it is in the music industry. The AI drummer on Apple’s Logic Pro. That’s amazing, and that really is putting folks out of work. Likewise iZotope’s mastering plugins.


If only iZotope realised how much money there was in [[reg tech]], they wouldn’t be faffing around with hobbyist home recording types like yours truly.
===The theoretical reason your job is safe===
Strap in: we’re going [[Epistemology|epistemological]]. Evolution does not converge on a Platonic ideal. It ''departs'' from an imperfect ''present''. It isn’t intelligent; it is a blind, random process reacting to [[Known known|known knowns]], and unable to anticipate the future — that place where, as [[Criswell]] so rightly said, you and I are going to spend the rest of our lives. Those known knowns are a random, complex and unpredictable environment. What is a dominant factor today might not be tomorrow<ref>Just ask the inventors of, oh, the canal, the cassette, the mini-disc, the laserdisc, the super-audio CD, VHS, Betamax, the MP3 player ...</ref>, and the interdependencies, foibles and illogicalities that interact to create that process are so overwhelmingly [[Complexity|complex]] that it is categorically impossible to predict. If you could, the pathway from a sharpened stick to the singularity would be dead straight, the development of technology would be a relentless process of refinement, all mysteries could be solved by algorithm, and ''we would be there by now''.


BUT THAT’S NOT AT ALL HOW IT WORKS.
Evolutionary design space is at least four-dimensional (if you include time) and for all we know may curve with gravity into other dimensions too. If you were to regard it from a dispassionate, stationary frame of reference (you know, a magnetic anomaly planted on the moon or something), you would see technological progress, like fashion, hurtling chaotically around the room like a deflating balloon.
From our position on that balloon, the parts of design space we leave behind seem geometrically more stupid the further away we get.
Bear in mind also that an artefact’s value is in large part a function of its cost of production. Absent a monopolistic impulse, its cost of production is a ''ceiling'' on its value. The moment you can digitise and automate something costlessly, its value drops to zero. No-one will pay you to do this for them ''because they can do it for themselves''. This is just one of those unpredictable illogicalities that randomise the trajectory of evolutionary design.
Proposition therefore: ''an activity’s value depends on its cost/difficulty of production''. If you automate a hitherto complex operation, the value of your offering drops proportionately and you need to make it ''more'' complex. Now the complexity resides not inside that automated component, but in how that automated component interacts with the other components. By removing a certain degree of cost and uncertainty, you’ve added cost and uncertainty at a more abstract scale. The [[subject matter expert]]ise required to understand that wider, systemic complexity is far greater than was required to manage that component pre-automation (by definition, right? You just automated it with a machine that does it for free. How hard can it be?).


===The actual reason [[why your job is safe]]===
*{{aiprov|On machine code and natural language}}
{{ref}}
{{devil}}