Averagarianism



In which the curmudgeonly old sod puts the world to rights.

Rory Sutherland has an excellent snippet about the danger of managing toward averages. Among his reasons:

  • The average — the top of the bell curve — is where everyone else will be targeting their product, so existing markets will be mature, barriers to entry high and margins at their slimmest. Go for the tails: find the influencers, meet them, and let them drive your product into the mainstream. Have the average follow you, not the other way around.
  • Converging on the same place everyone else is converging on isn’t good business but a recipe for bankruptcy. It is a race to the bottom. As with evolution, the secret is to realise the process is a continuous drift away from an unsatisfactory status quo towards something else that doesn’t have its drawbacks, not a process converging on a consensus. The ecosystem is not seeking an equilibrium. It is perpetually seeking to escape one.

Data modernism and the cult of the aggregate

A prelude to the great delamination: there is a strand of modernist thinking, running from Robert Moses to Le Corbusier, that there is an optimisable configuration for human interaction, that it can be derived by a rigorously scientific, or at least mathematical, method, and that the only obstacle to implementing it has been the lack of a sufficiently powerful machine to run the calculation.

That time has now arrived, or is close at hand: the means are at our disposal. We now have the processing power to take massive amounts of unstructured data — “noise”, in the vernacular — and extrapolate from it a signal. We don’t necessarily understand how the algorithms extrapolate that signal; they just do — the inscrutability is part of the appeal: there is no “all-too-human” bias[1] — and there is a belief, stretching from paid-up Randian anarcho-capitalists through to certified latter-day socialists, that we can solve our problems with data.

Now data, as it comes, is an incoherent, imperfect, meaningless thing. It is the pre-theatre chat; a “hubbub”: made up of millions of individual communications, conversations and interactions, all of which have their own (possibly imperfect) meanings between their participants, but which, taken as a whole, have no particular meaning at all.

Imagine taking every one of the pre-performance conversations between all the patrons at the Saturday matinee performance of Eureka Day at The Old Vic[2] — that meaningless hubbub — and summarising it into a single sentence, designed to reflect what “the theatre was thinking”. Then you feed that single confabulated sentence back to all the theatre patrons and say “this is the conversation which the theatre was having. Now, which side were you on?” People will tend to take sides, and will invest themselves in that conversation.

But, remember, the hubbub was just noise all along. None of the individual conversations had anything to do with each other. All had their own, independent meanings. They are immune to aggregation.

We say “we have unconscious biases and they inform our reactions”. Well, no shit.

To extract signal from noise is to filter, limit, compress and selectively amplify on the premise that there is a signal; that the hubbub is something like a de-tuned radio, or that we are hunting pulsars, quasars and intelligent life on the SETI array. But we are not. There isn’t always a signal. The SETI array is a bad metaphor: there, we are trying to tease out a bilateral signal that is there from a spectrum of other radiation that is qualitatively different but broadcast on the same frequency. With the human hubbub there is a spectrum of unconnected communications and no real “signal” at all. We are not trying to isolate a single conversation from all the others — that would be the direct analogy — but to extract an aggregated message that is not actually there, and to treat it as an emergent property of all those conversations. This is a different thing entirely. There is no emergent property of millions of unrelated conversations. The result is brown, warm and even: maximum entropy.
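By way of a toy sketch (mine, not Sutherland’s or anyone else’s, and with every modelling choice an assumption made purely for illustration), here are a few lines of Python showing what averaging does to signals that are individually coherent but mutually unrelated. Each “conversation” is caricatured as a sine wave with its own random frequency and phase: perfectly meaningful on its own terms, connected to nothing else.

    import numpy as np

    rng = np.random.default_rng(42)
    t = np.linspace(0.0, 1.0, 500)

    # Caricature each "conversation" as a signal that is perfectly coherent
    # on its own terms: a sine wave with its own random frequency and phase.
    n = 10_000
    freqs = rng.uniform(1.0, 50.0, n)
    phases = rng.uniform(0.0, 2.0 * np.pi, n)
    conversations = np.sin(2.0 * np.pi * freqs[:, None] * t + phases[:, None])

    # Any single conversation has unmistakable structure...
    print(f"one conversation, peak amplitude: {np.abs(conversations[0]).max():.3f}")

    # ...but the average of all of them washes that structure out: it shrinks
    # towards zero (roughly as 1/sqrt(n)). Flat, featureless: maximum entropy.
    hubbub = conversations.mean(axis=0)
    print(f"average of {n:,} conversations, peak amplitude: {np.abs(hubbub).max():.3f}")

Because the conversations share no common component, there is nothing for the average to recover: any “signal” reported out of the hubbub is the bias we imported, not an emergent property of the conversations.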

To make something out of nothing is to deliberately bias. It is to carve David out of a marble block. Bias creates meaning. There may be local meanings — maybe — based on local interactions and echo chambers, but these are informal, incomplete and impossible to delimit.

So we tend to “extrapolate” central figures from random noise: economic growth. The intention behind an expressed electoral preference. Average wages. The wage gap. Why the stock market went up. That the stock market went up. These are spectral figures. They are ghosts, gods, monsters and devils. They are no more real than religions, for all that they are the product of “science” and “techne”.

We have, on occasion, some convenient proxies, but they are just proxies: in an election, for example, a manifesto. Without a manifesto, a binary vote for a single candidate in a local electorate (I am assuming first-past-the-post, but in honesty it isn’t wildly different under proportional representation) tells us nothing whatever about why an individual voted as she did. A manifesto helps, by a process of deemery.

Did every Conservative voter read the party’s manifesto? Almost certainly not. Did every Conservative voter who did read it subscribe to every line? Again, almost certainly not. Did anyone subscribe to every line of it? Perhaps, but by no means certainly. So can we legitimately infer uniform support for the Conservatives’ manifesto from all who voted Conservative? No. We do so only by dint of the political convention that those who vote for a party are deemed to support its manifesto (if one is published). But even that convention is a spectre. And where the vote is an issue-based referendum, there is not even a manifesto. Who knows why 33 million people voted in the Brexit referendum? Who could possibly presume to aggregate all those individual value judgments into a single guiding principle? There were 17 million reasons for voting leave. They tell us nothing except... leave.

And yet, in the delaminated onworld — especially as it feeds its simplified “signal” back on itself and thereby amplifies it — we draw our battle lines and attack based on these invented signals. We take them, and make them our own. We truck in archetypes of our own devising.[3] Trans activists fight for the rights of — and here, I confess immediately, I am doing exactly what I complain of — exotic, beautiful, fragile, elfin, teenage, dolphin-like creatures of beguiling androgyny and harmlessness, as if all trans-identifying people were like that. Gender-critical activists, on the other hand, fight against middle-aged male sex offenders operating under cover, as if all trans people were like that.

Yet such a patently ludicrous argument animates the public square. It is no more real than vampires fighting werewolves. Why do we take it any more seriously?

Hence the delamination: the online world is a world of extruded, ghoulish signals aggregated from the unfiltered noise of discourse. The offline world — can we call it the offworld? — is a world of bilateral conversations, one on one. A world of shades, nuance, detail, richness, complexity and — for the most part — civility.


Feedback loops

Then there is the feeding of that signal back into the memeplex, without necessarily surveilling it or taking anything out of it: this would take in machine learning, AI and the like.

References

  1. At least, until the algo goes rogue and becomes a Nazi.
  2. Real-life example, needless to say.
  3. Our personal conceptualisations of archetypes never quite map to the world: hence the “Google Disappointment Effect”, when an image search (or AI prompt) never quite returns the image you had in mind. This is a variation on the “no average fighter pilot” effect.