Complexity

The JC’s amateur guide to systems theory
[Image: a dancing landscape, yesterday.]

All other things being equal, a bummer — if you have in mind risk — and a boon — if you have in mind reward. A violation of Occam’s razor; a source of confusion, a time-sink, a material contributor to catastrophic normal accidents; a waste — yet in a distributed network of autonomous bodies, an inevitability. The more sophisticated the group of individuals, the greater the rate of complexification.

There are those determinists who see complexity as just a sort of presently-incalculable state of a simple algorithm, like Conway’s Game of Life. We do not agree.
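To give the determinists their due before dismissing them, here is a minimal sketch of Conway’s Game of Life in Python: a handful of trivially simple rules from which wildly intricate behaviour emerges, discoverable in practice only by running them. The starting pattern and generation count below are chosen arbitrarily, purely for illustration.

<syntaxhighlight lang="python">
from collections import Counter

def step(live):
    """Advance one generation. `live` is a set of (x, y) coordinates of live cells."""
    # Count live neighbours for every cell adjacent to at least one live cell.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is live next generation if it has exactly three live neighbours,
    # or if it is already live and has exactly two.
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The "R-pentomino": five cells that keep churning for over a thousand generations.
cells = {(1, 0), (2, 0), (0, 1), (1, 1), (1, 2)}
for _ in range(100):
    cells = step(cells)
print(len(cells), "live cells after 100 generations")
</syntaxhighlight>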

Complexity as a bummer

  • Chernobyl
  • Space Shuttle Challenger
  • Any financial calamity you care to mention

Complexity as a boon

  • T’Internet
  • The gradual disintermediation of information from its substrate
  • Jacquard loom

Complicated versus complex

Things can be merely complicated without being complex. Complicated problems are naturally difficult, but you can solve them with rules and algorithms. The systems and processes by which The Man commands and controls employees can manage this kind of complicatedness.

Algorithms, systems and processes don’t work for complex problems, however. Complex problems involve independent, autonomous agents interacting in un-anticipatable ways. No pre-defined rule-set can anticipate the interactions. Black swans and technology disruptions, but also interlocking complicated systems (nuclear power plants, space shuttles, air-traffic control systems), are complex.

Identify your systems

Is your system simple, complicated or complex?

Simple systems: situations in which essentially inanimate objects interact with each other in ways that are fully understood. Lego is a simple system. So is a cake recipe, or a bungee jump. The components of a simple system don’t fight back. Simple systems are therefore predictable. They can only go wrong if components fail or you don’t follow instructions. In either case they fail in predictable ways. As such, simple systems are suitable for checklists,[1] recipes etc., where algorithms can overcome the hubris that will surely rain down on the heads of those who treat simple processes as trivial. Disinfecting your instruments before performing heart surgery, for example, is a simple step to take, but not a trivial one.

Complicated systems require interaction with autonomous agents whose specific behaviour is beyond the observer’s control, and may be intended to defeat the observer’s objective, but whose range of behaviour is deterministic, rule-bound and predictable in advance, and where the observer’s own behaviour does not itself interfere with the essential equilibrium of the system.

You know you have a complicated system when it cleaves to a comprehensive set of axioms and rules, and thus it is a matter of making sure that the proper models are being used for the situation at hand. Chess and Go are complicated, but not complex, systems. So are most sports. You can “force-solve” them, at least in theory.

Complicated systems benefit from skilled management and some expertise to operate: a good chess player will do better than a poor one, and clearly a skilled, fit footballer can execute a plan better than a wheezy novice — but in the right hands and given good instructions even a mediocre player can usually manage without catastrophe. While success will be partly a function of the user’s skill and expertise, a bad player with a good plan may defeat a skilled player with a bad one.

Given enough processing power, complicated systems are predictable, deterministic and calculable. They are tame problems, not wicked ones.
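By way of a sketch of what “force-solving” means, here is an exhaustive minimax evaluation of noughts and crosses, a game chosen only because it is small enough to solve outright; chess is, in principle, the very same exercise, just monstrously more expensive.

<syntaxhighlight lang="python">
# Exhaustively evaluate noughts and crosses: a complicated system is, in
# principle, calculable, because its rules bound every possible interaction.

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in LINES:
        if board[a] is not None and board[a] == board[b] == board[c]:
            return board[a]
    return None

def solve(board, to_move):
    """Value of the position with best play: +1 X wins, -1 O wins, 0 draw."""
    w = winner(board)
    if w is not None:
        return 1 if w == "X" else -1
    empties = [i for i, cell in enumerate(board) if cell is None]
    if not empties:
        return 0
    outcomes = []
    for move in empties:
        board[move] = to_move
        outcomes.append(solve(board, "O" if to_move == "X" else "X"))
        board[move] = None
    return max(outcomes) if to_move == "X" else min(outcomes)

print(solve([None] * 9, "X"))   # 0: with perfect play the game is a draw
</syntaxhighlight>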

Complex systems present as “wicked problems”. They are dynamic, unbounded, incomplete, contradictory and constantly changing. They comprise an indefinite set of subcomponents that interact with each other and the environment in unexpected, non-linear ways. They are thus unpredictable, chaotic and “insoluble” — no algorithm can predict how they will behave in all circumstances. Probabilistic models may work passably well most of the time, but the times when statistical models fail may be exactly the times you really wish they didn’t, as Long Term Capital Management would tell you. Complex systems may comprise many other simple, complicated and indeed complex systems, but their interaction with each other will be a whole other thing. So while you may manage the simple and complicated sub-systems effectively with algorithms, checklists and playbooks, and may manage the whole system in normal times, you remain at risk of “tail events” in abnormal circumstances. You cannot eliminate this risk: accidents in complex systems are inevitable — hence “normal”, in Charles Perrow’s argot. However well you manage a complex system it remains innately unpredictable.
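A toy illustration, and only that, of why the statistical models let you down just when you need them: two invented regimes, a “normal” one of mild, independent noise and a rare crowded-exit regime in which the same shock is amplified. Every number below is made up for the purpose; the point is simply that a Gaussian fitted to the whole history treats the worst day as a many-sigma near-impossibility.

<syntaxhighlight lang="python">
import random
import statistics

random.seed(42)   # for a repeatable illustration

def daily_move():
    # Normal times: mild, roughly independent noise.
    if random.random() < 0.995:
        return random.gauss(0.0, 1.0)
    # Crowded exit: everyone heads for the door at once and the same shock is amplified.
    return random.gauss(0.0, 1.0) * 15

history = [daily_move() for _ in range(10_000)]
mu = statistics.fmean(history)
sigma = statistics.stdev(history)
worst = min(history)

print(f"fitted mean {mu:.2f}, fitted standard deviation {sigma:.2f}")
print(f"worst day {worst:.1f}, i.e. {(worst - mu) / sigma:.1f} fitted sigmas from the mean")
# The Gaussian fitted to the whole history assigns that day a probability of
# roughly zero; the fit itself is dominated by the regime it cannot foresee.
</syntaxhighlight>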

See also

  • Conway’s Game of Life
  • Normal distribution
  • Barnacles
  • Normal Accidents: Living with High-Risk Technologies by the magnificent Charles Perrow

References