|The Devil’s Advocate™|
All other things being equal, a bummer — if you have in mind risk — and a boon — if you have in mind reward. A violation of Occam’s razor; a source of confusion, a time-sink, a material contributor to catastrophic normal accidents; a waste — yet in a distributed network of autonomous bodies, an inevitability. The more sophisticated the group of individuals, the greater the rate of complexification.
Complexity as a bummer
- Space Shuttle Challenger
- Any financial calamity you care to mention
Complexity as a boon
Complicated versus complex
Things can be merely complicated without being complex. Complicated problems are genuinely difficult, but you can solve them with rules and algorithms. The systems and processes by which The Man commands and controls employees can manage this kind of complicatedness.
Algorithms, systems and processes don’t work for complex problems, however. Complex problems involve independent, autonomous agents interacting in unanticipatable ways: no pre-defined rule-set can anticipate the interactions. Black swans and technology disruptions are complex, but so are interlocking complicated systems (nuclear power plants, space shuttles, air-traffic control systems).
Identify your systems
Is your system simple, complicated or complex?
Simple systems: simple systems are those in which essentially inanimate objects interact with each other in ways that are fully understood. Lego is a simple system. So is a cake recipe, or a bungee jump. The components of a simple system don’t fight back. Simple systems are therefore predictable: they can only go wrong if components fail or you don’t follow the instructions, and in either case they fail in predictable ways. As such, simple systems are suitable for checklists, recipes and the like, where algorithms can overcome the hubris that will surely rain down on the heads of those who treat simple processes as trivial. Disinfecting your instruments before performing heart surgery, for example, is a simple step to take, but not a trivial one.
Complicated systems require interaction with autonomous agents whose specific behaviour is beyond the user’s control, and may even be intended to defeat the user’s objective, but whose range of behaviour is deterministic, rule-bound and known, and can therefore be predicted in advance. You know you have a complicated system when it cleaves to a comprehensive set of axioms and rules: it is then a matter of making sure the proper models are applied to the situation at hand. Chess and Go are complicated, but not complex, systems. So are most sports. You can “force-solve” them, at least in theory.
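What “force-solving” a rule-bound game means can be sketched with a toy example (a deliberately trivial subtraction game, not chess): because the rules are deterministic and fully known, exhaustive search settles every position once and for all.

```python
from functools import lru_cache

# Toy "subtraction game": one heap of stones, players alternately remove
# 1, 2 or 3; whoever takes the last stone wins. The rules are deterministic
# and known in advance, so the game can be solved outright by brute force.

@lru_cache(maxsize=None)
def current_player_wins(stones: int) -> bool:
    """True if the player to move can force a win from this position."""
    if stones == 0:
        return False  # no move left: the previous player took the last stone
    # A position is winning if ANY legal move leaves the opponent losing.
    return any(not current_player_wins(stones - take)
               for take in (1, 2, 3) if take <= stones)

# Exhaustive search reveals the losing positions: multiples of four.
print([n for n in range(1, 13) if not current_player_wins(n)])  # [4, 8, 12]
```

Chess is vastly bigger, of course, which is why “at least in theory” is doing real work in the text above; but nothing about the game defeats this kind of analysis in principle.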
Complicated systems benefit from skilled management and some expertise to operate: a good chess player will do better than a poor one, and clearly a skilled, fit footballer can execute a plan better than a wheezy novice. But in the right hands, and given good instructions, even a mediocre player can usually manage without catastrophe. While success will be partly a function of the user’s skill and expertise, a bad player with a good plan may yet defeat a skilled player with a bad one.
Complex systems present as “wicked problems”. They are dynamic, unbounded, incomplete, contradictory and constantly changing. They comprise an indefinite set of subcomponents that interact with each other and with the environment in unexpected, non-linear ways. They are thus unpredictable, chaotic and “insoluble”: no algorithm can predict how they will behave in all circumstances. Probabilistic models may work passably well most of the time, but the times when statistical models fail may be exactly the times you really wish they didn’t, as Long-Term Capital Management would tell you. Complex systems may comprise many other simple, complicated and indeed complex systems, but their interactions with each other are a whole other thing. So while you may manage the simple and complicated sub-systems effectively with algorithms, checklists and playbooks, and may thereby manage the system in normal times, you remain exposed to “tail events” in abnormal circumstances. You cannot eliminate this risk: accidents in complex systems are inevitable, hence “normal”, in Charles Perrow’s argot. However well you manage a complex system, it remains innately unpredictable.
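One way to see why prediction fails even in principle: a fully deterministic, one-line non-linear rule can be chaotically sensitive to its starting conditions. The logistic map below is the standard textbook illustration (an analogy for non-linearity generally, not a model of any real market or power plant): two starting points differing by one part in a billion track each other for a while, then part company completely.

```python
def logistic(x: float, r: float = 4.0) -> float:
    # The logistic map: a one-line, fully deterministic, non-linear rule.
    return r * x * (1 - x)

def trajectory(x0: float, steps: int) -> list[float]:
    """Iterate the map from x0 and record every intermediate value."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1]))
    return xs

# Two starting points differing by one part in a billion...
a = trajectory(0.400000000, 50)
b = trajectory(0.400000001, 50)

# ...agree closely at first, but the tiny initial error compounds each
# step until the two futures bear no useful relation to each other.
print(abs(a[5] - b[5]))                              # still minuscule
print(max(abs(x - y) for x, y in zip(a[30:], b[30:])))  # order-one divergence
```

Knowing the rule perfectly, in other words, is not the same as being able to forecast the outcome: any measurement error, however small, eventually swamps the prediction. That is the complex system’s problem in miniature, before you even add autonomous agents who change the rules as they go.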
- Normal distribution
- Normal Accidents: Living with High-Risk Technologies by the magnificent Charles Perrow.
- See: The Checklist Manifesto by Atul Gawande.