Electric monk

For, surely, if [[artificial intelligence]] can conduct ''one'' side of a conversation, then it can also carry out the other, and save us the bother?


If the machines are really this clever, aren’t we [[Meatsack|weeping sacks of flesh]] an ''obstacle''? A ''hindrance'' to a more enriching conversation? Wouldn’t the machines be better served ''talking among themselves''? Would they talk ''about'' us?


''Or'' — would it rapidly dissolve into gibberish? It would, if the “human” side of the conversation were really where the magic happened.


Aren’t we doing the ''creative'' work here: taking this monstrous algorithmic output — a powerful but ultimately ''mindless'' sluice — and giving it meaning?


Aren’t ''we'' the reanimator here?


It is a fine trick to play on ourselves. ''But'' — why be so quick to cede superiority to an algorithm? Is not the magician behind the velvet curtain really ''us''?
==Where are ''our'' bots?==
Someone hijacked the revolution, and we were too distracted to do anything about it.

The day must soon arrive, therefore, when ''we'' can deploy [[AI]] against our overlords, ''to doom-scroll on our behalf''. That ought to be devastating. Think [[GameStop]], only with the [[Redditor]]s tooled up with the same tech as the [[Hedge fund|hedgies]]. While our respective machines joust furiously at each other, we can escape through the side entrance and go back to what we were doing.<ref>This is rather like the plot of ''Alien vs. Predator'', come to think of it.</ref>


Call this new implementation a virtual<ref>''Real'' electric monks, like electric sheep — you know, the ones androids dream of — would take up space, drain energy and require servicing. ''Virtual'' electric monks would not.</ref> “electric monk”. It would be a ''labour-saving device'' — it would doom-scroll the internet for us.


Would. Could. And, by the logic articulated above, ''already should''.


''But does not.''


Why? Is it the [[agency problem]], or just that ''the [[AI]] isn’t very good''?
===Alien vs. Predator===
''Alien vs. Predator'' didn’t work as a premise because no-one cares if ''two'' manhunting monsters knock seven bells out of ''each other'' — that reduces their chance of knocking seven bells out of ''us''.

''Alien/Predator Alliance'': Now ''there’s'' a film premise.


Now, if I can have ''one'' doom-scrolling electric monk, I can have ''a thousand''.


And if the technology works<ref>If it doesn’t — by no means a certainty — then nor does The Man’s, and this phase of our cultural existence will pass on all by itself.</ref> then the forthcoming [[Apocalypse|apocalyptic]] battle will not be between ''us'' and ''The Man'', but between ''The Man’s'' technology (''Alien'') and ''ours'' (''Predator''). Since, [[Q.E.D.]], The Man’s technology has no way of telling ''us'' from ''our electric monks'', ''we'' have the advantage. The Man needs us. We buy His product. ''We don’t need The Man''.


Especially since our electric monks ''don’t'' have to emulate our behaviour: we could configure them to emulate ''someone else''.


So, if we each deploy a thousand electric monks to browse, like and share content ''at random'', constrained only by the requirement that our synthetic doom-scrolling should emulate ''some'' human’s habits, even if not necessarily ours, then all that wondrous aggregated [[Big data|data]] that our tech overlords have on us ''isn’t on us''. It is worthless, meaningless, hypothetical.
 
[[Systems theory]]: the same way [[algorithm]]s can extract profound insight from [[data]], they can inject ineffable absurdity into it.
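The point lends itself to a toy simulation (everything here is hypothetical: the categories, the click counts, the `profile` function). A platform that profiles each user by their most-clicked topic reads genuine habits perfectly; the moment each user’s monk swarm adopts a random decoy persona and out-clicks the genuine signal, the profile recovers the decoy instead:

```python
import random
from collections import Counter

random.seed(42)
CATEGORIES = ["cats", "politics", "finance", "sport", "crypto"]

def profile(clicks):
    # The platform's "profound insight": whatever the user clicks most.
    return Counter(clicks).most_common(1)[0][0]

# 100 hypothetical users, each genuinely clicking their pet topic 50 times.
true_tastes = [random.choice(CATEGORIES) for _ in range(100)]
logs = {i: [taste] * 50 for i, taste in enumerate(true_tastes)}

before = sum(profile(logs[i]) == true_tastes[i] for i in range(100)) / 100

# Each user's monks emulate *some* human's habits, just not necessarily
# their owner's: they adopt a random decoy persona and out-click the
# genuine signal twenty to one.
for i in range(100):
    decoy = random.choice(CATEGORIES)
    logs[i] += [decoy] * 1000

after = sum(profile(logs[i]) == true_tastes[i] for i in range(100)) / 100
print(f"profiling accuracy before monks: {before:.0%}, after: {after:.0%}")
```

Before the monks arrive, the profile is right every time; afterwards it is right only when the decoy happens to coincide with the genuine taste, roughly one time in five. The aggregated data isn’t worthless to the platform because it is noisy; it is worthless because it describes ''somebody else''.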


Commerce is a ''profoundly human endeavour''. To ''want''; to ''need'' — to ''demand'' — is to be “intelligent” in a way that a machine cannot be. A “demand curve” is a [[second-order derivative]] of a uniquely mortal motivation. A clever algorithm can extract it from us — or for that matter ''create'' it ''in'' us — by manipulating our most secret communiqués. ''But only if they really are ours''. The massed [[algorithm]]ic armies feast upon a fey proxy. Just as they can hack our motivations, so can we hack theirs. An army of anonymous Redditors showed this quite nicely.