Template:M intro design System redundancy

{{quote|{{d|High modernism|haɪ ˈmɒdᵊnɪzᵊm|n}}
A form of modernism characterised by an unfaltering confidence in science and technology as means to reorder the social and natural world.}}
==1. Data modernism==


[[System redundancy|One of the]] [[JC]]’s pet theories is that western commerce — especially the [[Financial services|parts concerned with moving green bits of paper around]] — is deep into the regrettable phase of a love affair with “[[data modernism]]”, our own term for a computer-adulterated form of [[high modernism]].
{{High modernism capsule}}


 
There were two Utopian ideas that died in the twentieth century, and one that didn’t. The survivor was [[F. W. Taylor]]’s [[scientific management]]: the view that, just as the natural world can be ordered by science, so can the business world be ordered, and controlled, by ''[[process]]''. [[Process]] is a sort of [[algorithm]] that runs on a carbon and not a silicon [[substrate]]: that is, ''us''. [[Taylorism]], too, once out of favour, is making a revival in the networked, data-driven world. We call this “[[data modernism]]”. It is encapsulated in the expression, attributed to Edwin R. Fisher:
{{quote|In God we trust. All others must bring data.<ref>Notably, Fisher made the statement to a Senate subcommittee in rebuttal to the proposition that passive smoking is bad for you: “I should like to close by citing a well-recognised cliché in scientific circles. The cliché is, “In God we trust, others must provide data.” What we need is good scientific data before I am willing to accept and submit to the proposition that smoking is a hazard to the nonsmoker.”</ref>}}


====Bring your own job satisfaction====
In pitting ''information'' against ''experience'' [[data modernism|this philosophy]] has systematically undermined the importance in organisations of ''those with [[ineffable]] expertise''. They are called “[[Subject matter expert|subject-matter experts]]”, which sounds venerable until you hear the exasperated tone in which it is uttered.
 
Over forty years the poor [[SME]] has, by a thousand literal cuts, been stripped of her status and weathered a slow but inevitable descent into the quotidian: first they came for her assistants — typists, receptionists, proof-readers, mail and fax room attendants — then her perks — business travel, away days, taxis home — then her kit — company cars, laptops, mobile devices — then her space: that once-commodious office became communal, then lost its door, then its walls, diminished to a dedicated space along a row, and most recently has become a conditional promise of a sanitised white space in front of a [[telescreen]] somewhere in the building, should you be in early, or enough of your colleagues be away sick or on holiday.
 
 
With the exploding power of information processing the range of things for which we must still rely on that [[Subject matter expert|necessary evil]] has diminished. Many [[thought leader]]s<ref>The most prominent is [[Ray Kurzweil]], though honourable mention to DB’s former CEO John Cryan and, of course, there is the redoubtable [[Richard Susskind|Suss]]. </ref> foretell it is only a matter of time until there are none left at all.


This systematic deprecation of [[expert|expertise]] is a logical consequence of [[data modernism]]: human “magic” is not good, but risky, evanescent, fragile, expensive, inconstant and, most of all, ''hard to quantify'' — and if you can’t quantify it, you can’t evaluate it, and if you can’t evaluate it you shouldn’t, in a data-optimised world, ''do'' it.
====Sciencing the shit out of business====
[[Data modernism]]’s central [[metaphor]] works by treating human workers as if they were carbon-based [[Turing machine]]s, and “the firm” an orchestrated network of automatons. Orchestration happens centrally, from the place with the best view of the big picture: the top.<ref>Curiously, this is not the theory behind distributed computing, which is rather [[end-to-end principle|controlled from the edges]]. But still.</ref>  


From the top, the only “[[Legibility|legible]]” information is [[data]], so all germane [[Management information and statistics|management information]] takes that form — you know: “[[Signal-to-noise ratio|In God we trust, all others must bring data]]”. With enough of the stuff, so the theory goes, everything about the organisation’s ''present'' can be known, and from a complete picture of the present one can extrapolate to the future.


Armed with all the [[Signal-to-noise ratio|data]], the organisation’s permanent infrastructure can be honed down and dedicated to its core business. Peripheral functions — [[operation]]s, [[personnel]], [[legal]] and ''~ cough ~'' strategic [[management consultant|management advice]] — can be [[Outsourcing|outsourced]] to specialist service providers, and then scaled up or down as management priorities dictate<ref>“Surge pricing” in times of crisis, though.</ref> or switched out should they malfunction or otherwise be surplus to requirements.<ref>A former general counsel of UBS once had the bright idea of creating a “shared service” out of its legal function that could be contracted out to other banks, like [[Credit Suisse]]. He kept bringing the idea up, though it was rapidly pooh-poohed each time. Who knew it would work out so well in practice?</ref>


''Business'' became “business-process-as-a-service”.


We should, by now, feel like we are in a new and better world — right? — yet customer experience feels worse than ever. Just try getting hold of a bank manager now. “BAU-as-a-service” has streamlined and enhanced the great heft of what businesses do, at the cost of outlying opportunities for which the model says there is insufficient business case.
====Pareto triage====
We call this effect “[[Pareto triage]]”. Great for the huddled masses who just want the normal thing, but it poorly serves the long tail of oddities and opportunities. Those just beyond that “[[Pareto triage|Pareto threshold]]” have little choice but to manage their expectations and take a marginally unsatisfactory experience as the best they are likely to get. Customers subordinate their own priorities to the preferences of the model. This is a poor business outcome. And, unless you are McDonald’s, the idea that 80% of your customers ''want'' exactly the same thing — as opposed to being prepared to put up with it, in the absence of a better alternative — is a kind of wishful [[averagarianism]].
====The Moneyball effect: experts are bogus====
In the meantime, those [[subject matter expert]]s who don’t drop off altogether wither on the vine. Even though we have less than perfect information, algorithmic extrapolations, derivations and [[Large language model|pattern matches]] from whatever we do have are presumed to yield greater predictive value than any [[subject matter expert]]’s “[[ineffable]] wisdom”.


This is the {{br|Moneyball}} lesson. Our veneration for human expertise is a misapprehension. It is, er, ''not borne out by the data''. And in the twenty-first century we are ''inundated'' with data. Business optimisation is just a hard mathematical problem. Now we have computer processing power to burn, it is a [[knowable unknown]]. To the extent we fail, we can put it down to not enough data or computing power — ''yet''. But the [[singularity]] is coming, soon.


====The persistence of rubbish====
{{Singularity and perspective chauvinism}}


Since data quantity and computing horsepower have exploded in the last few decades, the [[high-modernist]]s have grown ever surer that their time — the [[Singularity]] — is nigh. Before long, everything will be solved.
And if so, it’s worth asking again, ''how come everything seems so joyless and glum''? Are we ''missing'' something?
 
That would explain a curious dissonance: these modernising techniques arrive and flourish, while traditional modes of working requiring skill, craftsmanship and tact are outsourced, computerised, right-sized and AI-enhanced — yet the end product gets no less cumbersome, no faster, no leaner, and no less risky. ''[[Tedium]] remains constant''.<ref>This may be a [[sixteenth law of worker entropy]].</ref>
 
There may be fewer [[subject matter expert]]s around, but there seem to be more [[software-as-a-service]] providers, [[Master of Business Administration|MBA]]s,  [[COO]]s, [[workstream lead]]s and [[Proverbial school-leaver from Bucharest|itinerant school-leavers in call-centres on the outskirts of Brașov]].<ref>We present therefore the [[JC]]’s [[sixteenth law of worker entropy]] — the [[law of conservation of tedium]].</ref>


====Taylorism====
None of this is new: just our enthusiasm for it. The prophet of [[data modernism]] was [[Frederick Winslow Taylor]], progenitor of the maximally efficient production line. His inheritors say things like “[[The Singularity is Near|the singularity is near]]” and “[[Software is eating the world|software will eat the world]]”, but for all their millenarianism the on-the-ground experience at the business end of all this world-eating software is as grim as it ever was.
==2. Time==
====Reductionism about time====
The JC’s developing view is that this grimness is caused by the poverty of this model when compared to the territory it sets out to map. For the magic of an algorithm is its ability to reduce a rich, multi-dimensional experience to a succession of very simple, one-dimensional steps. But in that [[reductionism]], we lose something ''essential''.
 
The Turing machines on which [[data modernism]] depends have no ''[[tense]]''. There is no past or future, perfect or otherwise, in code: there is only a permanent simple ''present''. A software object’s ''past'' is rendered as a series of date-stamped events and presented as metadata in the present. An object’s ''future'' is not represented at all.
 
In reducing everything to data, spatio-temporal continuity is represented as an array of contiguous, static ''events''. Each has zero duration and zero dimension: they are just ''values''.
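The point can be made concrete with a minimal sketch — the class, names and values here are all hypothetical, invented for illustration: in code, an object’s “history” is just an array of date-stamped values, with no intrinsic link between one and the next.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    """A zero-duration observation: a bare value with a timestamp."""
    timestamp: int
    value: float

# A security's "past", as code renders it: discrete, date-stamped
# snapshots, all sitting in a permanent present.
history = [
    Event(timestamp=1, value=100.0),
    Event(timestamp=2, value=101.5),
    Event(timestamp=3, value=99.8),
]

# Nothing in the data links one event to the next, nor says what happened
# in between: any continuity is imputed by the human reader, not stored
# by the code.
latest = max(history, key=lambda e: e.timestamp)
```

Each `Event` is self-contained; the list order and the timestamps are the only “history” there is.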
 
Now the beauty of a static frame is its economy. It ''can’t move'', it can’t surprise us, and it takes up minimal space. We can replace bandwidth-heavy actual spacetime — in which ''three''-dimensional objects project backwards and forwards in a ''fourth'' dimension — with ''apparent'' time, rendered in the single symbol-processing dimension that is the lot of all Turing machines.
 
The apparent temporal continuity that results, like cinematography, is a conjuring trick: it does not exist “in the code” at all; rather the output of the code is presented in a way that ''induces the viewer to impute continuity to it''. When regarding the code’s output, the user ascribes her own conceptualisation of time, from her own natural language, to what she sees. The “magic” is not in the machine. It is in her head.
 
For existential continuity backwards and forwards in “time” is precisely the problem the human brain evolved to solve: it demands a projection of continuously existing “things” with definitive boundaries, just one of which is “me”, moving through spacetime, interacting with each other. None of this “continuity” is “in the data”.<ref>{{author|David Hume}} wrestled with this idea of continuity: if I see you, then look away, then look back at you, what ''grounds'' do I have for believing it is still “you”? Computer code makes no such assumption. It captures property A, timestamp 1; property A, timestamp 2; property A, timestamp 3: these are discrete objects with a common property, in a permanent present — code imputes no necessary link between them, nor does it extrapolate intermediate states. It is the human genius to make that logical leap. How we do it, ''when'' we do it — generally, how human consciousness works — defies explanation. {{author|Daniel Dennett}} made a virtuoso attempt to apply this algorithmic [[reductionist]] approach to the problem of mind in {{br|Consciousness Explained}}, but ended up defining away the very thing he claimed to explain, effectively concluding “consciousness is an illusion”. But on whom?</ref>
 
Turing machines, and the [[data modernism]] that depends on them, do away with the need for time and continuity altogether, instead ''simulating'' them through a succession of static slices — but that continuity vanishes when one regards the picture show as a sequence of still frames.
 
But existential continuity is not the sort of problem you can define away. Dealing with history and continuity is exactly the thing we are trying to solve.
 
[[Gerd Gigerenzer]] has a nice example that illustrates the importance of continuity.  


Imagine a still frame of two pint glasses, A and B, each containing half a pint of beer. Which is half-full and which is half-empty? This static scenario poses an apparently stupid question. It is often used to illustrate how illogical and imprecise our language is. But this is only true if the still frame is considered in the abstract: that is, stripped of its ''context'' in time and space.  


For, imagine a short film at the start of which glass A is full and glass B is empty. Then a little Cartesian imp arrives, picks up glass A and tips half into glass B. ''Now'' which is half-full and which is half-empty? ''That history makes a difference''.


The second, time-bound scenario tells us something small but meaningful about the history of the world. The snapshot does not.
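Gigerenzer’s example can be sketched in a few lines of toy code — the numbers are obviously invented: two quite different histories collapse into the same final frame, so the frame alone cannot tell half-full from half-empty.

```python
# Two hypothetical histories of the two glasses, recorded as
# (glass_a, glass_b) volumes in pints at successive frames.
history_1 = [(1.0, 0.0), (0.5, 0.5)]   # A was full; the imp tipped half into B
history_2 = [(0.0, 1.0), (0.5, 0.5)]   # the mirror image: B was full

# The final frames are indistinguishable; the answer to "which glass is
# half-full?" lives only in the history the snapshot discards.
same_snapshot = history_1[-1] == history_2[-1]
different_histories = history_1 != history_2
```

The snapshot comparison succeeds while the history comparison fails: all the information is in the frames the still picture throws away.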


====Assembling a risk period out of snapshots====
The object of the exercise is to have as fair a picture of the real risk of the situation with as little information processing as possible. Risks play out over a timeframe: the trick is to gauge what that is.


Here is the appeal of [[data modernism]]: you can assemble the ''appearance'' of temporal continuity — the calculations for which are ''gargantuan'' — out of a series of data snapshots, the calculations for which are merely ''huge''.  


Information processing capacity being what it is — still limited, approaching an asymptote and increasingly energy consumptive (rather like an object approaching the speed of light) — there is still a virtue in economy. Huge beats gargantuan. In any case, the shorter the window of time we must represent to get that fair picture of the risk situation, the better. We tend to err on the short side.


For example, the listed corporate’s quarterly reporting period: commentators lament how it prioritises unsustainable short-term profits over long-term corporate health and stability — it does — while modernists, and those resigned to it, shrug their shoulders with varying degrees of regret, shake their heads and say that is just how it is.


For our purposes, another short period that risk managers look to is the ''liquidity period'': the longest plausible time one is stuck with an investment before one can get out of it. The period of risk. This is the time frame over which one measures — guestimates — one’s maximum potential unavoidable loss.  


Liquidity differs by asset class. Liquidity for equities is usually almost — not quite, and this is important — instant. Risk managers generally treat it as “a day or so”.


For an investment fund it might be a day, a month, a quarter, or a year. Private equity might be 5 years. For real estate it is realistically months, but in any case indeterminate. Probabilistically you are highly unlikely to lose the lot in a day, but over five years there is a real chance.
 
So generally the more liquid the asset, the more controllable is its risk.  But liquidity, like volatility, comes and goes. It is usually not there when you most need it.  So we should err on the long side when estimating liquidity periods in a time of stress.
 
But the longer the period, the greater the chance of loss, and the harder things are to calculate. We are doubly motivated to keep liquidity periods as short as possible.
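The trade-off can be illustrated with a toy Monte Carlo, on frankly invented parameters — a two-percent-daily-volatility random walk, nothing like a calibrated risk model: the longer you are stuck with the position, the likelier the catastrophic gap.

```python
import random

def gap_loss_probability(days: int, drop: float = 0.30,
                         daily_vol: float = 0.02, trials: int = 5000) -> float:
    """Toy Monte Carlo: the chance a random-walk stock touches `drop`
    below its starting price at some point within `days` trading days.
    Parameters are invented for illustration, not calibrated to anything."""
    hits = 0
    for _ in range(trials):
        price = 1.0
        for _ in range(days):
            price *= 1.0 + random.gauss(0.0, daily_vol)
            if price <= 1.0 - drop:   # the 30 percent gap has happened
                hits += 1
                break
    return hits / trials

random.seed(42)  # deterministic for the sake of the example
p_one_day = gap_loss_probability(days=1)     # essentially zero
p_one_year = gap_loss_probability(days=250)  # a real chance
```

On these assumptions a 30 percent fall inside one day is a fifteen-sigma event, while over a trading year the same fall is entirely plausible — which is the sense in which a liquid asset’s risk is more controllable.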
 
====[[Glass half full]] and multidimensionality====
Here is where history — ''real'' history, not the synthetic history afforded by [[data modernism]] — makes a difference.
 
''On a day'', the realistic range in which a stock can move in a liquidity period — its “gap risk” — is relatively stable: say, 30 percent of its [[market value]]. (This [[market value]] we derive from technical and fundamental readings: the business’s book value, the presumption that there is a sensible [[bid]] and [[ask]], so that the stock price will oscillate around its “true value” as bulls and bears cancel each other out under the magical swoon of [[Adam Smith]]’s [[invisible hand]].)
 
But this view is assembled from static snapshots which don’t move at all. Each frame carries ''no'' intrinsic risk: the ''illusion'' of movement emerges from the succession of frames. Therefore [[data modernism]] is not good at estimating how long a risk period should be. Each of its snapshots, when you zero in on it, is a still life: here, shorn of its history, a “[[glass half full]]” and a “[[glass half empty]]” look alike.
 
We apply our risk tools to them as if they were the same: ''assuming the market value is fair, how much could I lose in the time it would realistically take me to sell?'' Thirty percent, right?
 
But they are ''not'' the same.
 
If a stock trades at 200 today, it makes a difference that it traded at 100 yesterday, 50 the day before that, and over the last ten years traded within a range between 25 and 35. This history tells us this glass, right now, is massively, catastrophically over-full: the beer in it is, somehow, freakishly forming an improbable spontaneous column above the glass, restrained and supported by nothing but the laws of extreme improbability, and it is liable to revert to its Brownian state at any moment, with beer spilt ''everywhere''.

With that history we might think a drop of 30 percent is our ''best''-case scenario.
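To put rough numbers on the contrast — the 30 percent gap and the 25–35 range are the figures from the passage above; the two functions themselves are invented for illustration and are not any real risk methodology:

```python
def snapshot_gap_risk(spot: float, gap: float = 0.30) -> float:
    """What the still-frame model says you could lose: a fixed
    fraction of today's market value, with no regard to history."""
    return spot * gap

def history_aware_risk(spot: float, hist_low: float, hist_high: float) -> float:
    """A cruder but history-aware view: if the stock has spent years
    between hist_low and hist_high, reversion to the middle of that
    range is the realistic downside. Purely illustrative."""
    midpoint = (hist_low + hist_high) / 2
    return max(spot - midpoint, 0.0)

spot = 200.0
# The snapshot model sees roughly 60 of downside on a 200 stock; the
# ten-year range of 25-35 suggests something closer to 170.
loss_snapshot = snapshot_gap_risk(spot)
loss_with_history = history_aware_risk(spot, hist_low=25.0, hist_high=35.0)
```

Same frame, wildly different risk — the difference is entirely in the history the snapshot discards.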


=== It’s the long run, stupid===
Then, your time horizon for redundancy is not one year, or twenty years, but ''two-hundred and fifty years''. Quarter of a millennium: that is how long it would take to earn back $5 billion in twenty million dollar clips.


==3. Redundancy==
====On the virtue of slack====
Redundancy is another word for “slack”, in the sense of “looseness in the tether between interconnected parts of a wider whole”.




To be sure, the importance of employees, and the value they add, is not constant. We all have flat days where we don’t achieve very much. In an operationalised workplace they pick up a penny a day on 99 days out of 100; if they save the firm £ on that 100th day, it is worth paying them 2 pennies a day every day even if, 99 days out of 100, you are making a loss.
===Fragility and tight coupling===
The “leaner” a distributed system is, the more ''[[fragile]]'' it will be and the more “[[single points of failure]]” it will contain, whose malfunction, in the best case, will halt the whole system and, in [[tightly-coupled]] [[complex system]]s, may trigger a chain reaction of successive component failures and unpredictable nonlinear consequences. On 9/11, as Martin Amis put it, 20 box-cutters created two million tonnes of rubble, left 4,000 dead, and transformed global politics for a generation.


A financial market is a [[complex system]]. It comprises an indeterminate number of autonomous actors, many of whom (notably participating corporations) are themselves complex systems, interacting in unpredictable ways. It is the nature of a complex system that it is unpredictable — not merely in the sense that a jointed pendulum is unpredictable, whose exact path cannot be forecast but whose possible range of movement, its design space, is known to 100 percent. The “design space” of a complex system is unknown. You cannot calculate, even probabilistically, how it will behave. The fact that, for long periods, it appears to cleave closely to simple operating parameters is beside the point.


The robustness of any system depends on the tightness of the coupling between components. How much slack is there? In financial markets, increasingly, none at all.