Template:Data as a self-fulfilling prophecy
Knowledge is knowing that a tomato is a fruit; wisdom is not putting it in a fruit salad.
—Miles Kington
There is, the JC freely narratises, an epochal battle raging between wisdom and technocracy, which the technocrats have, for thirty years, been winning. As we are gradually immersed in the superficial charm of technology, it feels like an end-game: there is no way out for the meatware. No titanic clash, no great final conflict — just a feeble whimpering out of human expertise, finally beaten down by the irrepressible energy of the algorithm. The latest front, artificial intelligence, feels like a coup de grâce: the inevitable endpoint of human uselessness.
Weisendämmerung: the twilight of the wise.
Wisdom only comes with time, experience and anecdotally accumulated expertise. It is hard to acquire and expensive to buy. Technology, by contrast, requires no time and no expertise: only brute information-processing capacity — which gets ever cheaper — and enough data to process — which gets ever more abundant.
The more data there is, the more you can process, the more neural networks can crawl over it, pattern matching and framing and analysing, and the more “insight” — unexpected, machine-generated insight — we can extract.
Data is theory-dependent
But this insight is a function of the data: we can’t analyse or pattern-match data we don’t have — and what we do have we must select, filter, format, array and frame according to some pre-existing theory of the game. Our data paints a picture from shadows: we block out “irrelevant” data, data we have collected but which doesn’t advance, bear upon or fit our theory. The more data we have, the more of it we must block to make a meaningful model.
So, dilemma: the less data we have, the stronger the model, but the less reliable the insight, because we don’t know what we’re missing. The more data, the weaker the model, and the less reliable the insight, because we still don’t know what we’re missing, but the more of what we do know we have had to rule out to draw a single coherent model.
We don’t know how experts do what they do — that ineffability is their expertise — until they acquire tools to help them — digital tools — which make us lazy, at the same time pushing experts towards activities that generate metadata (the tools don’t help with the ineffable stuff that doesn’t generate metadata) that the technocrats can collect. (A conversation across the desk is purely analogue; it contains no recordable data or metadata; a typed letter is an analogue artefact with no metadata; a facsimile is a digital graphic of an analogue artefact with limited extractable data or metadata; an electronically transmitted ASCII document is only data, and has no meaningful analogue existence at all.)
And as the talent loses, we succumb to data, increasingly giving it off, great clods of it, which the technocrats then harvest and weaponise back at us in some self-fulfilling apocalyptic prophecy. No matter that the data are necessarily historical, a formalistic digital sketch of a model: they can only see what they can see. They cannot measure the value of actions not taken, crises headed off, investment costs avoided through quick thinking and the untraced application of human common sense, because, necessarily, there is no data about what did not happen.
The technocrats build tools to make lives easier, tools which, as a by-product, happen to generate data; and then the data is all the residue that remains, not the lives made easier.
And the more data we give off, the more it emboldens the technocrats: the more universal and all-telling the data seems, and the more readily they immerse themselves in an alternative universe described by it.