Template:M intro design System redundancy

{{Quote|A form of modernity, characterised by an unfaltering confidence in science and technology as means to reorder the social and natural world.
:—Wikipedia, on ''High Modernism''}}


[[System redundancy|One of the]] [[JC]]’s favourite theories is that western commerce, especially the [[Financial services|parts involved with the movement of green bits of paper]], is deep into the regrettable phase of a doomed love affair with a computer-adulterated form of [[high modernism]]. We call this “[[data modernism]]” and ascribe to it the following view: just as the natural world can be ordered by science, so can the business world be ordered, and controlled, by ''[[process]]''. [[Process]] is a sort of [[algorithm]] that runs on a carbon and not a silicon [[substrate]]. I.e., us.
====Bring your own job satisfaction====
[[Data modernism]] has systematically undermined the significance, within the organisation, of ''those with ineffable expertise''. As a result, the poor professional has been, by a thousand cuts — literally — denuded of her status. In a slow but inevitable descent into the quotidian, she has been expected to supply her ''own'' accoutrements: do-it-yourself typing; [[bring your own device]] — and, all the while, that once-commodious office became communal, then lost its door, then its walls, diminished to a dedicated space along a row, and most recently has become a conditional promise of a sanitised space at a [[telescreen]] somewhere in the building, assuming you’re quick, or enough people are out sick or on holiday.


This systematic deprecation of [[expert|expertise]] is a logical consequence of [[data modernism]]: human “magic” is not good, but risky, evanescent, fragile, expensive, inconstant and, most of all, ''hard to quantify'' — and if you can’t quantify it, you can’t evaluate it, and if you can’t evaluate it you shouldn’t, in a data-optimised world, ''do'' it.
====Sciencing the shit out of business====
The [[metaphor]] works best if we consider the workforce to be carbon-based Turing machines: a firm, company or association is then, materially, a distributed network of computers. Such a distributed network is best optimised centrally, and from the place with the best view of the big picture: the top.<ref>curiously, this is not the theory behind a distributed network of computers, which is rather [[end-to-end principle|controlled from the edges]]. But still.</ref> All relevant information can be articulated as [[data]] — you know: “[[Signal-to-noise ratio|In God we trust, all others must bring data]]” — and, with enough data, everything about the organisation’s present can be known and its future extrapolated: this is the promise of science and technology.<ref>It isn’t. It really, really isn’t. But still.</ref>


The organisation’s permanent infrastructure should be honed down and dedicated to its core business, and its peripheral activity — [[operation]]s, [[personnel]], [[legal]] and ''~ cough ~'' strategic [[management consultant|management advice]] — outsourced to specialist [[service provider]]s who can be scaled up or down as requirements dictate,<ref>“Surge pricing” in times of crisis, though.</ref> or switched out altogether should they malfunction or otherwise be surplus to requirements.


This philosophy of optimally efficient allocation of resources — espoused as it is by ''~ cough ~'' strategic [[management consultant|management advisors]] — can seem self-serving. It is responsible for a generational drift from inefficient businesses run arbitrarily by unionised humans to enterprises run like unblinking machines: infinitesimally-sliced ''processes'', each [[triage]]d and managed by pre-automated applications, with what minimal human oversight there is provided by external service providers in low-cost locations.


''Business'' became “business-process-as-a-service”.


We should, by now, feel like we are in a new and better world — right? — yet customer experience feels worse than ever. Just try getting hold of a bank manager now. “BAU-as-a-service” has streamlined and enhanced the great heft of what businesses do, at the cost of outlying opportunities for which the model says there is insufficient business case.
====Pareto triage====
We call this effect “[[Pareto triage]]”. Great for the huddled masses who just want the normal thing. But it poorly serves the long tail of oddities and opportunities. Those just beyond that “[[Pareto triage|Pareto threshold]]” have little choice but to manage their expectations and take a marginally unsatisfactory experience as the best they are likely to get. Customers subordinate their own priorities to the preferences of the model. This is a poor business outcome. And, unless you are McDonald’s, the idea that 80% of your customers ''want'' exactly the same thing — as opposed to being prepared to put up with it in the absence of a better alternative — is a kind of wishful [[averagarianism]].
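
To make the point concrete, here is a purely illustrative sketch (hypothetical names throughout; no suggestion anyone’s service desk is literally built this way) of a router that automates whatever request types cover the first 80% of volume, and deflects everything past the threshold:

<syntaxhighlight lang="python">
# Purely illustrative: a toy "Pareto triage" router. All names are hypothetical.
from collections import Counter

def pareto_triage(requests, threshold=0.80):
    """Automate the most common request types until they cover `threshold`
    of total volume; everything past that point is deflected, not served."""
    counts = Counter(requests)
    total = len(requests)
    automated, covered = set(), 0
    for request_type, volume in counts.most_common():
        if covered / total >= threshold:
            break  # past the Pareto threshold: "no business case"
        automated.add(request_type)
        covered += volume
    return [
        (r, "automated process") if r in automated
        else (r, "please hold, your call is important to us")
        for r in requests
    ]
</syntaxhighlight>

Note that the sketch never asks whether anything in the deflected long tail ''mattered''; it only asks how much of the volume the automated process covers.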
====The Moneyball effect: experts are bogus====
It gets worse for the poor old [[subject matter expert]]s. Even though, inevitably, one has less than perfect information, extrapolations, mathematical derivations and [[Large language model|algorithmic pattern matches]] from a large but finite data set will — it is ''deduced'' — have better predictive value than the gut feel of “[[ineffable]] [[expert]]ise”.


The status we have historically assigned to experienced experts is grounded in folk psychology, lacks analytical rigour and, when compared with sufficiently granular data, cannot be borne out: this is the lesson of {{br|Moneyball: The Art of Winning an Unfair Game}}. Just as Wall Street data crunchers can have no clue about baseball and still outperform veteran talent scouts, so can data models and analysts who know nothing about the technical details of, say, the ''law'' outperform humans who do when optimising business systems. Thus, from a network of programmed but uncomprehending rule-followers, a smooth, steady and stable business revenue stream [[emerge]]s. Strong and stable. Strong and stable. Repeat it enough and it sounds plausible.
 
Since the world overflows with data, we can programmatise business. Optimisation is now just a hard mathematical problem to be solved and, now that we have computer processing power to burn, it is a [[knowable unknown]]. To the extent we fail, we can put it down to not enough data or computing power — ''yet''. But the singularity is coming, soon.
====The persistence of rubbish====
It’s worth asking again: if we’re getting nearer some kind of optimised nirvana, how come everything seems so joyless and glum?


Since data quantity and computing horsepower have exploded in the last few decades, the [[high-modernist]]s have grown ever surer that their time — the [[Singularity]] — is nigh. Before long, everything will be solved.


But, a curious dissonance: these modernising techniques arrive and flourish, while traditional modes of working requiring skill, craftsmanship and tact are outsourced, computerised, right-sized and AI-enhanced — and yet the end product gets no less cumbersome, no faster, no leaner, and no less risky. There may be fewer [[subject matter expert]]s around, but there seem to be more [[software-as-a-service]] providers, [[Master of Business Administration|MBA]]s, [[COO]]s, [[workstream lead]]s and [[Proverbial school-leaver from Bucharest|itinerant school-leavers in call-centres on the outskirts of Brașov]].
 
====Taylorism====
None of this is new: just our enthusiasm for it. The prophet of [[data modernism]] was [[Frederick Winslow Taylor]], progenitor of the maximally efficient production line. His inheritors say things like “[[The Singularity is Near|the singularity is near]]” and “[[Software is eating the world|software will eat the world]]”, but for all their millenarianism the on-the-ground experience at the business end of all this world-eating software is as grim as it ever was.
 
====Time====
We have a theory that, in reducing everything to quantifiable inputs and outputs, data modernism tends to a kind of data ''[[reductionism]]'', only about ''time'': just as radical rationalists see all knowledge as reducible to, and explicable in terms of, its infinitesimally small sub-atomic essence, so the data modernists see it as explicable in terms of infinitesimally small windows of ''time''.


This is partly because computer languages don’t do [[tense|''tense'']]: they are coded in the present, and have no frame of reference for continuity. Whereas existential continuity backwards and forwards in “time” is precisely the trick that the human brain plays: this is the thing that demands “things”, just one of which is “me”, moving through a spatio-temporal universe, interacting with each other and hence requiring definitive boundaries.<ref>None of these things are necessary, or even coherent, without a sense of continuous time. Hence any [[difference engine]] that can operate wholly without needing a concept of continuity won’t evolve consciousness: why would it? I have put some thoughts together on that [[Code and language|here]]. I should say this is all my own work and is likely therefore to be nonsense.</ref> And it is partly because having to cope with history, the passage of time, and the continued existence of objects, makes things exponentially more complex than they already are. An atomically thin snapshot of the world as data is enough of a beast to be still well beyond the operating parameters of even the most powerful quantum machines: that level of detail extending into the future and back from the past is, literally, infinitely more complicated. The modernist programme is to suppose that “time” is really just composed of billions of infinitesimally thin, static slices, each functionally identical to any other, so by measuring the [[delta]] between them we have a means of handling that complexity.
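
For what it is worth, here is a purely illustrative sketch of that last move: the organisation’s history modelled as a stack of static, tenseless snapshots, with “time” reduced to the [[delta]] between adjacent slices. The field names are hypothetical; the point is only that nothing in the model persists, remembers or anticipates. It just differs.

<syntaxhighlight lang="python">
# Purely illustrative: the data-modernist picture of "time" as static slices,
# with change reduced to the delta between them. Field names are hypothetical.
from dataclasses import dataclass, fields

@dataclass(frozen=True)
class Snapshot:
    """One infinitesimally thin, tenseless slice of the organisation."""
    headcount: int
    revenue_m: int      # revenue, in millions, in whatever currency you like
    open_tickets: int

def delta(before: Snapshot, after: Snapshot) -> dict:
    """Everything this model 'knows' about time: the difference between two slices."""
    return {f.name: getattr(after, f.name) - getattr(before, f.name)
            for f in fields(before)}

q1 = Snapshot(headcount=1200, revenue_m=18, open_tickets=310)
q2 = Snapshot(headcount=1150, revenue_m=19, open_tickets=420)
print(delta(q1, q2))  # {'headcount': -50, 'revenue_m': 1, 'open_tickets': 110}
</syntaxhighlight>

Nothing in the sketch has a past or a future: there is no tense, only subtraction. Whether that is an adequate model of a business, or of anything, is rather the question this section is asking.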