Signal-to-noise ratio

{{a|devil|
{{freeessay|systems|signal-to-noise ratio|{{image|Infinity|png|Where<br>“n” is the data in which you trust; and<br>“x” is the data you haven’t got yet.}}}}
}}{{quote|
''Caught in a mesh of living veins,<br>
''In cell of padded bone,<br>
''He loneliest is when he pretends<br>
''That he is not alone.<br>
<br>
''We’d free the incarcerate race of man<br>
''That such a doom endures<br>
''Could only you unlock my skull,<br>
''Or I creep into yours.<br>
:—{{Author|Ogden Nash}}, ''Listen...''}}
 
{{quote|
''In God we trust, all others must bring data.
:—Edwin R. Fisher}}
 
{{quote|''{{Taleb antifragile signal to noise}}''}}
 
If the information content of the universe, [[Space-time continuum|through all time and space]] is ''as good as'' infinite<ref>This assumes there is not a finite end-point to the universe; by no means settled cosmology, but hardly a rash assumption. And given how little we have of it, the universe’s total information content ''might as well be'' infinite, when compared to our finite collection of mortal data. Even the total, ungathered-by-mortal-hand, information content generated by the whole universe ''to date'', not even counting the unknowable future, is as good as infinite.</ref> and the data ''homo sapiens'' has collected to date is necessarily finite<ref>[[There is no data from the future]].</ref> (even counting what we’ve lost along the way), it follows that the total value of our [[data]] — in which Professor Fisher would have us trust — is, like any other finite number divided by infinity, ''mathematically nil''.
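The arithmetic here can be made explicit (a sketch of my own, using the ''n'' and ''x'' of the caption above): any finite stock of data, divided by a quantity that grows without bound, tends to zero.

<math>\lim_{x \to \infty} \frac{n}{x} = 0 \quad \text{for any finite } n</math>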
 
And that is before you consider the ''quality'' of our data. If 90% of all gathered data originates from the internet age,<ref>Eric Schmidt said something like this in 2011, and it sounds [https://blog.rjmetrics.com/2011/02/07/eric-schmidts-5-exabytes-quote-is-a-load-of-crap/ totally made up], but let’s run with it, hey?</ref> a good portion of our summed human knowledge comprises cat videos, [[The Jolly Contrarian:About|self indulgent wikis]] and hot takes on [[Twitter]] — so is ''shite'' data, even on its own terms.<ref>[[Get off Twitter]], okay? For all of our sakes.</ref>
 
In any case, it follows that, should we transcend our meagre [[hermeneutic]] bubbles, and free the incarcerate race of {{sex|man}}, so to speak, the ratio of ''our'' data — good, bad, indifferent — to ''all possible data in the universe, past and future'' is ''infinitesimal''.<ref>That means, ''really'' small.</ref>
 
If this is what we’re meant to trust, you might ask what is so wrong with God. We are [[The Patterning Instinct: A Cultural History of Humanity’s Search for Meaning - Book Review|pattern-seeking machines]]. It’s not like we take the data as we find them, coolly fashioning objective axioms from them, carving nature at its joints: we bring our idiosyncratic prisms and pre-existing cognitive structures to the task — our own “hot takes” — and wantonly create patterns to support our pre-existing convictions.
 
This is not a criticism as much as a piece of resignation: an observation. ''This'' is the doom our incarcerate race endures.
 
It is not just the Twitterati. Science, too, has its [[confirmation bias]]es at a meta-level, uncontrollable even by double-blind testing methodologies.  Experiments which ''confirm'' a hypothesis are ''a lot'' more likely to be published than those which ''don’t''.<ref>{{br|The Hidden Half: How the World Conceals its Secrets}}, by Michael Blastland.</ref> Of those failed experiments that ''are'' published, far fewer are cited in other literature. [[Falsification]]s ''die''.
 
It happens in any and every social structure: even [[woke]] ones. Whoever occupies a position of standing or influence in a [[power structure]] wields the tremendous power to ''ignore'' ideas which don’t suit her own predilections. As long as the power structure is not in crisis, “inconvenient”, “awkward”, “disruptive” suggestions are a distraction; they divert resources from the true path; they undermine the existing programme, and those who are conducting it. It is much easier to overlook the direction a new suggestion might take than to sink energy, time and resources into a fight you might end up losing. We all do it. Ask Kodak: someone there ''invented the digital camera''. Management looked the other way: it didn’t suit the path the firm was on.
 
[[Falsification]]s ''die''.
 
This is neither a cause for alarm nor is it new. It is just a reminder of how important, in all human discourse, are [[Contingencies|contingency]], provisionality and, above all, ''humility''. ''Your data is likely bunk''.
 
All of this is another way of attacking a familiar problem: the universe, the world, the nation, your market, your workplace and even your interpersonal relationships are [[complex]], not just [[complicated]]. Mere [[complication]] is a ''function'' of a [[paradigm]]. It is part of the game. It is within the rules. It is soluble by sufficiently skilled application of the rules. Complication can be beaten by an algorithm. You ''can'' brute-force it.
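The brute-force point can be illustrated with a toy example (my own, not from the essay): a combination lock is merely ''complicated'' — tedious, but guaranteed soluble within the rules — whereas no such enumeration exists for a genuinely ''complex'' system.

```python
import itertools

# A merely "complicated" problem stays inside fixed rules, so exhaustive
# search is guaranteed to find the answer. Hypothetical example: a
# 3-digit combination lock has only 10**3 = 1,000 states to try.
SECRET = (4, 0, 2)  # the combination (an assumption for illustration)

def crack(secret):
    """Try every 3-digit combination in turn until one opens the lock."""
    for guess in itertools.product(range(10), repeat=3):
        if guess == secret:
            return guess
    return None  # unreachable: the rules guarantee the answer is in the space

print(crack(SECRET))  # prints (4, 0, 2)
```

The algorithm never needs insight, only patience: that is what "within the rules" buys you, and what complexity takes away.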
 
[[Complexity]], you cannot.
 
[[Complexity]] describes the ''limits'' of the [[narrative]]. [[Complexity]] is the wilderness ''beyond'' the [[rules of the game]]. [[Complexity]] inhabits the noise, not the signal. Where there is complexity, ''algorithmic rules do not work''. Here ''data'' is relegated to ''noise''.<ref>Provisional theory:  “information” is [[data]] framed with a hypothesis.</ref>
 
This is why the physical sciences apparently have greater success than the social sciences: they ask themselves easier questions. Physical sciences generally address the behaviour of independent events — rolling balls, [[Coin flip|flipping coins]], waves [[and/or]] particles of light. Rolling balls are not autonomous agents: they act independently, and the behaviour of one will not influence that of another. Each [[coin flip]] is, as a condition of probability theory, independent.<ref>The technical term: “independent and identically distributed”, or “i.i.d.”.</ref> Independent events obey Gaussian principles. They may be modelled. That is to say, they may be [[complicated]] but they remain predictable, at least in theory. When physical systems inexplicably go bang — Chernobyl, the Shuttle ''Challenger'', the ''Torrey Canyon'' — the [[root cause]] will not be a failure of the physical science underlying the engineering, but some supervening cause invalidating the underlying assumptions on which the physical science was based. Things go bang because of [[non-linear interaction|''non-linear'' interactions]].
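The independence claim can be checked with a toy simulation (a sketch under my own assumptions, not from the original): because each fair-coin flip ignores every other, sums of many flips cluster, Gaussian-fashion, around a mean and spread you can predict in advance.

```python
import random
import statistics

# Independent events obey Gaussian principles: by the Central Limit
# Theorem, the number of heads in n independent fair flips is roughly
# Gaussian with mean n*p and standard deviation sqrt(n*p*(1-p)).
random.seed(42)  # fixed seed so the run is repeatable

def sum_of_flips(n_flips: int) -> int:
    """Count heads in n_flips independent fair-coin flips."""
    return sum(random.random() < 0.5 for _ in range(n_flips))

n, trials = 1000, 2000
sums = [sum_of_flips(n) for _ in range(trials)]

mean = statistics.fmean(sums)
stdev = statistics.stdev(sums)

# Theory predicts mean ≈ 500 and stdev ≈ sqrt(1000 * 0.25) ≈ 15.8.
print(f"observed mean ≈ {mean:.1f}, observed stdev ≈ {stdev:.1f}")
```

Make the flips influence one another — let each coin copy its neighbours, as humans do — and the Gaussian prediction collapses: which is the social sciences’ whole problem.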
 
[[Social science]]s don’t have that get-out-of-jail-free card: they address precisely that kind of supervening cause: behaviour that is, intrinsically, ''un''predictable. Psychology, sociology, anthropology, economics — these concern themselves with human agents, who ''are'' influenced by each other — which is why we don’t use physical science to predict their behaviour. Social sciences have to deal with the inherently complex, non-Gaussian interactions between human beings.<ref>Physical sciences set up closed logical systems within which their rules will work, and often these systems are dramatically simplified as compared with anything you see in the real world: Newton, for example, assumes a frictionless, stationary, stable, neutral frame of reference: circumstances which, in any observed environment, do not and ''cannot'' exist. {{author|Nancy Cartwright}} calls these structures “[[nomological machine]]s”. Because of this explicit caveat, we can put any variances between Newton’s prediction and the observed outcome down not to [[falsification]], but to the messy real world “contaminating” the idealised experimental conditions. Hence, the proverbial [[crisp packet blowing across St Mark’s Square]].</ref>
 
===Behaviourism and {{br|The Ghost in the Machine}}===
Now it wasn’t always like that. Fifty years ago psychologists were waging a battle royale against the positivist branch of their own discipline, which insisted on proceeding by reference, exclusively, to “public events” and ignoring private mental events. Can you imagine it: a ''psychology'' which ''ignores private mental events''? Can you imagine an approach to artificially reconstructing natural intelligence which ignores private mental events?
 
{{Quote|On the strength of this doctrine, the Behaviorists proceeded to purge psychology of all intangibles and unapproachables. The terms ‘consciousness’, ‘mind’, ‘imagination’ and ‘purpose’, together with a score of others were declared to be unscientific, treated as dirty words, and banned from the vocabulary. ... <br>
It was the first ideological purge of such a radical kind in the domain of science, predating the ideological purges in totalitarian politics, but inspired by the same single-mindedness of true fanatics.<br>
—Arthur Koestler, {{br|The Ghost in the Machine}}}}
 
You might ask what has changed, for it seems that neural networks, big data and natural language processing, all of which eschew the intentional fallacy, adopt ''exactly'' the Behaviourist disposition. Don’t they? On one hand, they have no choice: if human psychologists are struggling to understand how consciousness works in situ, ''in the actual mesh of living veins, in cell of padded bone'', is it any wonder people looking at its proxy in a digital network might not bother?
 
{{Sa}}
*[[Hindsight]]
*{{br|The Patterning Instinct: A Cultural History of Humanity’s Search for Meaning}}
*[[nomological machine]]
*[[Complexity]]
*[[Systems theory]]
{{ref}}