Complexity

From The Jolly Contrarian

All other things being equal, a bummer — if you have in mind risk — and a boon — if you have in mind reward. A violation of Occam’s razor; a source of confusion, a time-sink, a material contributor to catastrophic normal accidents; a waste — yet in a distributed network of autonomous bodies, an inevitability. The more sophisticated the group of individuals, the greater the rate of complexification.

Complexity as a bummer

  • Chernobyl
  • Space Shuttle Challenger
  • Any financial calamity you care to mention

Complexity as a boon

Complicated versus complex

Things can be merely complicated without being complex. Complicated problems are naturally difficult, but you can solve them with rules and algorithms. The systems and processes by which The Man commands and controls employees can manage this kind of complicatedness.

Algorithms, systems and processes don’t work for complex problems, however. Complex problems involve independent, autonomous agents interacting in unanticipatable ways. No pre-defined rule-set can anticipate the interactions. Black swans and technology disruptions are complex, but so are interlocking complicated systems (nuclear power plants, space shuttles, air-traffic control systems).

Identify your systems

Is your system simple, complicated or complex?

Simple systems: Simple systems are situations where essentially inanimate objects interact with each other in ways that are fully understood. Lego is a simple system. So is a cake recipe, or a bungee jump. The components of a simple system don’t fight back. Simple systems are therefore predictable. They can only go wrong if components fail or you don’t follow instructions. In either case they fail in predictable ways. As such, simple systems are suitable for checklists,[1] recipes and the like, where algorithms can overcome the hubris that will surely rain down on the heads of those who treat simple processes as trivial. Disinfecting your instruments before performing heart surgery, for example, is a simple step to take, but not a trivial one.
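The checklist point can be made concrete. Because a simple system’s failure modes are enumerable in advance, a plain linear checklist is all the “algorithm” you need. A minimal sketch (the steps and names are illustrative assumptions, not drawn from the article):

```python
# A simple system in miniature: every step is known in advance, steps
# don't interact unexpectedly, and "going wrong" means exactly one
# thing: a step was skipped. Hence a checklist suffices.
# (The steps below are illustrative assumptions, not the JC's.)

PRE_OP_CHECKLIST = [
    "disinfect instruments",
    "confirm patient identity",
    "verify surgical site",
]

def run_checklist(checklist, completed):
    """Return the steps still outstanding, in order.

    In a simple system this list exhausts the ways things can fail:
    there is nothing the components can do to surprise you.
    """
    return [step for step in checklist if step not in completed]

missing = run_checklist(PRE_OP_CHECKLIST, {"confirm patient identity"})
print(missing)  # the skipped steps, in checklist order
```

The point is not the code but its shape: a flat list, no feedback loops, no interacting agents. That is what makes a simple system checklist-able.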

Complicated systems: Unlike simple systems, complicated systems require interaction with autonomous agents whose specific behaviour is beyond the user’s control, and might be intended to defeat the user’s objective, but whose range of behaviour is entirely deterministic. Each autonomous agent’s range of possible actions and reactions can be predicted in advance. At least, in theory.

For example, chess — or, for that matter, any board game or sport.

Complicated systems therefore benefit from skilled management and some expertise to operate: a good chess player will do better than a poor one — a school-leaver from Bucharest with plenty of coffee and a playbook on her lap probably isn’t the droid you’re looking for — but in the right hands a complicated system can usually be managed without catastrophe, the degree of success being a function of the user’s skill and expertise.

You know you have a complicated system when it cleaves to a comprehensive set of axioms and rules, and thus it is a matter of making sure that the proper models are being used for the situation at hand. Chess and Go are complicated, but not complex, systems. You can “force-solve” them, at least in theory.[2] They are entirely deterministic, predictable and calculable, given enough processing power. They’re tame, not wicked, problems.
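What “force-solving” means can be shown on a game small enough to actually solve. A sketch, using tic-tac-toe rather than chess purely because its game tree fits in memory — the principle, exhaustive minimax over a deterministic rule-set, is the same:

```python
# "Force-solving" a complicated (but not complex) game: exhaustive
# minimax over tic-tac-toe. Every position's value is fully determined
# by the rules, so with enough processing power the whole tree can be
# enumerated. Chess and Go are the same in principle; only their trees
# are astronomically larger.

from functools import lru_cache

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in LINES:
        if board[a] != "." and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def value(board, player):
    """Game value with `player` to move: +1 win, 0 draw, -1 loss."""
    w = winner(board)
    if w:
        return 1 if w == player else -1
    if "." not in board:
        return 0  # board full: a draw
    other = "O" if player == "X" else "X"
    # Negamax: the mover picks the move worst for the opponent.
    return max(
        -value(board[:i] + player + board[i + 1:], other)
        for i, cell in enumerate(board) if cell == "."
    )

# Played perfectly from the empty board, tic-tac-toe is a draw:
print(value("." * 9, "X"))  # 0
```

That a thirty-line script settles the game once and for all is precisely the mark of a tame problem: given enough computation, the answer was always sitting there.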

Complex systems: Complex systems — also known as “wicked problems” — are dynamic, constantly changing, unbounded, incomplete, contradictory and conflicting systems consisting of an indefinite set of subcomponents that interact with each other and the environment in unexpected ways. They are thus unpredictable, chaotic and “insoluble”: no algorithm, heuristic or solution can predict how a complex system will behave in detail. You may apply probabilistic models to them, and these will work passably well most of the time, but the times when they don’t — the extreme cases — will be exactly the times you really wish they would, as the founders of Long-Term Capital Management would tell you. Complex systems may comprise many other simple, complicated and indeed complex systems, but their interaction with each other will be a whole other thing. So even if you manage the simple and complicated sub-systems effectively — deploy checklists, simplify, homogenise — this may limit the total damage a tail event can cause, but it cannot eliminate it. Accidents in complex systems are inevitable — hence “normal”, in Charles Perrow’s argot. However well you manage a complex system it remains innately unpredictable. It will do unexpected things. Like blowing up. So have your plans ready for dealing with those normal accidents.
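Why “passably well most of the time” is cold comfort can be put in numbers. Under a Gaussian model, extreme moves are so rare as to be effectively impossible, so a model calibrated to ordinary days says nothing useful about the days that destroy you. A hedged sketch — the 252-trading-day year and the sigma levels are illustrative assumptions, not figures from this article:

```python
# How rare a k-sigma daily move "should" be if returns were normal.
# (Trading-day count and sigma levels are illustrative assumptions.)

import math

def normal_tail(k: float) -> float:
    """P(Z > k) for a standard normal variable, via the
    complementary error function."""
    return 0.5 * math.erfc(k / math.sqrt(2))

TRADING_DAYS_PER_YEAR = 252

for k in (3, 6, 10):
    p = normal_tail(k)
    years_between = 1 / (p * TRADING_DAYS_PER_YEAR)
    print(f"{k}-sigma day: p = {p:.2e}, "
          f"expected roughly once every {years_between:,.0f} years")
```

On this model a six-sigma day turns up about once every few million years. Markets, being complex and fat-tailed rather than Gaussian, deliver them rather more often — which is the LTCM lesson in one line: the model was fine on average, and wrong exactly when it mattered.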

See also

References

  1. See: The Checklist Manifesto.
  2. Do you hear that, Daniel Susskind?