Complex system
You can’t eliminate the risk, so focus on managing it
- Everybody has a plan until they get punched in the mouth.
- —Mike Tyson
A traditional risk manager is conditioned to use control techniques to anticipate and eliminate all risk. In a complex system this is simply not possible. One must instead depend on local managers and experts making spontaneous decisions to address the situation as they find it, under conditions of significant uncertainty. A complex system is not totally random — were it so, any action would be as good as any other — so some control is possible, but it is not possible to prescribe in advance what that action should be.
Therefore plan, but not with an expected outcome in mind. Plan for the unexpected. Have band-aids, a Swiss Army knife, some duct tape and a towel with you. Try to imagine how things might unfold, and watch them as they do, adapting as you go.
- “When a man throws a ball high in the air and catches it again, he behaves as if he had solved a set of differential equations in predicting the trajectory of the ball. He may neither know nor care what a differential equation is, but this does not affect his skill with the ball. At some subconscious level, something functionally equivalent to the mathematical calculations is going on.”
- — Richard Dawkins with one of his “yeah, well, not quite, Dickie” moments. He has had his fair share of those over the years.
You cannot brute-force compute a wicked problem, like catching a moving ball, but you can still catch the ball: don’t think, “punch all the variables into a machine, run round to the resulting co-ordinate and stick your hand out.” You don’t have nearly enough information even to make the calculation. Instead, just run towards the damn thing, watching it, adjusting as you go.[1]
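For the avoidance of doubt, the fielder’s trick can be written down. Cognitive scientists call it the “gaze heuristic”: watch how fast the ball climbs in your visual field and adjust your running speed to keep that climb steady; hold it steady and you and the ball will arrive at the same spot at the same time. Here is a minimal sketch in Python, with illustrative numbers and a made-up feedback gain of my own choosing, not a model of actual fielding:

```python
# A toy sketch of the fielder's trick: the "gaze heuristic" (Gigerenzer),
# a.k.a. optical acceleration cancellation. The fielder never solves the
# ball's equations of motion. He watches how fast the ball climbs in his
# visual field and adjusts his running speed so that climb stays steady.
# Every number below is an illustrative assumption, not real fielding.

DT = 0.01          # simulation timestep, seconds
G = 9.81           # gravity, m/s^2
GAIN = 40.0        # how hard the fielder reacts when the climb rate drifts
TOP_SPEED = 9.0    # sprinting cap, m/s

ball_x, ball_y = 0.0, 0.0    # the ball leaves the bat at the origin...
vx, vy = 18.0, 22.0          # ...with this velocity (unknown to the fielder)
fielder_x = 55.0             # fielder starts well short of the landing point
speed = 0.0                  # his running speed (positive = retreating)

prev_r = None                # where the ball sat in his view last instant
target_rate = None           # the climb rate he locks onto off the bat

while True:
    # The ball does the differential equations...
    ball_x += vx * DT
    vy -= G * DT
    ball_y += vy * DT
    if ball_y <= 0.0:
        break                # the ball has come down (caught, we hope)

    # ...the fielder does none. He tracks a single number: how high the
    # ball sits in his visual field (the tangent of his gaze angle).
    d = fielder_x - ball_x
    if d < 0.05:
        break                # ball overhead: close enough
    r = ball_y / d
    if prev_r is not None:
        rate = (r - prev_r) / DT
        if target_rate is None:
            target_rate = rate   # "reading" the ball off the bat
        else:
            # Climbing faster than he remembers? It will sail over him:
            # back up. Slower? It is dropping short: run in.
            speed += GAIN * (rate - target_rate) * DT
            speed = max(-TOP_SPEED, min(TOP_SPEED, speed))
        fielder_x += speed * DT
    prev_r = r

print(f"ball comes down near x = {ball_x:.1f} m; fielder at x = {fielder_x:.1f} m")
```

Run it and the fielder, who knows nothing of parabolas, finishes more or less under the ball; set GAIN to zero and he stands rooted at 55 metres while it drops some twenty-five metres beyond him. No landing co-ordinate is ever computed: he just watches and adjusts.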
This is hard for a complicated-systems guy. Complicated systems you can brute-force, and you can predict how they will behave. You can pre-bake solutions, making them simpler. In complex systems you can’t: you need to keep your options open and be prepared to shift, adapt, re-evaluate, and toss out whatever you might have concluded before now. Philip Tetlock’s “Superforecasters” are complex-systems thinkers. Baseball players are complex-systems thinkers. Richard Dawkins, whom I like to imagine was dyspraxic,[2] is a complicated-systems thinker.
If a complex system blows up, “complicated” risk management systems can get in the way
Frequently, the risk attenuators of a complicated system can in fact aggravate risk situations in a complex one. Alarms going off make it harder to hear; multiple alarms compound the panic and obscure each other; an obligation to follow prescribed safety routines can impede a quick and surgical response to a traumatic situation. There are times, therefore, when you want to throw your checklist out the window.[3]
References
- ↑ A study a while back found that baseball players, while excellent at catching moving balls, were bad at predicting where they would land if they had to stand still.
- ↑ Largely because he was trying to solve differential equations instead of running after the ball, of course.
- ↑ I know, I know — try telling that to the chap who landed his plane on the Hudson thanks to his unflappable compliance with cockpit checklists, right?