Template:Data as a self-fulfilling prophecy



The more data there is, the more we can process, the more [[neural network|neural networks]] can crawl over it, [[pattern matching]], framing and analysing, and the more “insight” — unexpected, machine-generated insight — we can extract.
====Observation of “data” is theory-dependent====
But this insight is a function of the data: we can’t analyse or pattern-match data we don’t have — and what we do have we must select, filter, format, array and frame according to some pre-existing [[Narrative|theory of the game]]. Our data paints a picture from shadows: by blocking out “irrelevant” data we have collected but which doesn’t advance, bear upon or fit our theory. The more data we have, the more of it we must block to make a meaningful model.


So, dilemma: the ''less'' data we have, the stronger the model, but the less reliable the insight, because we don’t know what we’re missing. The ''more'' data, the weaker the model, and the less reliable the insight, because we still don’t know what we’re missing, but the more of what we ''do'' know we have had to rule out to draw a single coherent model.


''Data proves '''nothing''' in the abstract. It can be made to prove '''anything''' in the particular.''


====How experts work====
We don’t know how experts do what they do. That ineffability is their very expertise, since if we did know, we wouldn’t need them. As experts increasingly use digital tools, though, they spin off more and more ''data'' that the technocrats can collect and analyse. (A conversation across the desk is purely analogue; it contains no recordable data or metadata; a typed letter is an analogue artefact with no metadata; a facsimile is a digital graphic of an analogue artefact with limited extractable data or metadata; an electronically transmitted ASCII document is ''only'' data, and has no meaningful analogue existence at all).
====How technocrats work====
We succumb to data, increasingly giving it off, great clods of it, which the technocrats then harvest and weaponise back at us in some self-fulfilling apocalyptic prophecy. Because they can measure, they do measure. No matter that what they measure is meaningless, or that the data are necessarily historical: a formalistic digital sketch of a model. They can only see what they can see: they cannot measure the value of actions not taken, crises headed off, or investment costs avoided through quick thinking and the untraced application of human common sense, because, necessarily, ''there is no data about what did not happen''.


We design digital tools to make lives easier. They happen, as a by-product, to generate metadata. Though the metadata was not the reason for the tool, it becomes the justification for it. When the ineffable magic has happened, and evaporated into the atmosphere, the metadata is all the residue that remains, not the ''lives made easier by the magic''.
 
And the more data we give off, the more it emboldens the technocrats: the more universal and all-telling the data seems, and the more readily they immerse themselves in an alternative universe described by it.
