Complex system

Complex systems present as “wicked problems”. They are dynamic, unbounded, incomplete, contradictory and constantly changing. They comprise an indefinite set of subcomponents that interact with each other and the environment in unexpected, non-linear ways. They are thus unpredictable, chaotic and “insoluble” — no algorithm can predict how they will behave in all circumstances. Probabilistic models may work passably well most of the time, but the times when statistical models fail may be exactly the times you really wish they didn’t, as Long Term Capital Management would tell you. Complex systems may comprise many other simple, complicated and indeed complex systems, but their interaction with each other will be a whole other thing. So while you may manage the simple and complicated sub-systems effectively with algorithms, checklists and playbooks — and may manage the whole system in normal times — you remain at risk of “tail events” in abnormal circumstances. You cannot eliminate this risk: accidents in complex systems are inevitable — hence “normal”, in Charles Perrow’s argot. However well you manage a complex system, it remains innately unpredictable.
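By way of a toy illustration (a sketch with made-up numbers, and nothing to do with anyone’s actual models): simulate fat-tailed daily returns, fit a well-behaved Gaussian to the very same history, and ask each how often it expects a “five-sigma” daily loss. The Gaussian thinks millennia; the world that generated the data obliges every couple of years.

```python
# Toy sketch only: fat-tailed "daily returns" versus the Gaussian model
# a risk manager might fit to the same history. All parameters are invented.
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(1)

# The "true" world: Student-t returns with 3 degrees of freedom (fat tails).
returns = 0.01 * rng.standard_t(df=3, size=1_000_000)

mu, sigma = returns.mean(), returns.std()
threshold = mu - 5 * sigma                 # a "five-sigma" daily loss

# How often the fitted Gaussian expects such a loss...
p_gauss = 0.5 * erfc(5 / sqrt(2))
# ...and how often the simulated world actually delivered one.
p_world = (returns < threshold).mean()

print(f"Gaussian model : one five-sigma loss every {1 / p_gauss:,.0f} days")
print(f"Simulated world: one every {1 / p_world:,.0f} days")
```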


You can’t eliminate the risk, so focus on managing it

Everybody has a plan until they get punched in the mouth.

—Mike Tyson

A traditional risk manager will be conditioned to use control techniques to anticipate and eliminate all risk. In a complex system this is simply not possible. One must instead depend on local managers and experts making spontaneous decisions, under conditions of significant uncertainty, to address the situation as they see it. A complex system is not totally random — if it were, any action would be as good as any other — so some control is possible, but it is not possible to prescribe in advance what that action should be.

Therefore plan, but not with an expected outcome in mind. Plan for the unexpected. Have band-aids, a Swiss Army knife, some duct tape and a towel with you. Try to imagine how things might unfold, and watch them as they do, adapting as you go.

“When a man throws a ball high in the air and catches it again, he behaves as if he had solved a set of differential equations in predicting the trajectory of the ball. He may neither know nor care what a differential equation is, but this does not affect his skill with the ball. At some subconscious level, something functionally equivalent to the mathematical calculations is going on.”
— Richard Dawkins with one of his “yeah, well, not quite, Dickie” moments. He has had his fair share of those over the years.

You cannot brute-force compute a wicked problem, like dynamically catching a ball, but you can still catch a ball: don’t think, “punch all the variables into a machine, run round to the resulting co-ordinate and stick your hand out.” You don’t have nearly enough information even to make the calculation. Instead, just run towards the damn thing, watching it and adjusting as you go.[1]
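A toy sketch of the difference, if you like that sort of thing (the launch speed, fielder speed and observation noise are all invented): one simulated fielder takes a single noisy reading of the ball off the bat, computes the landing spot and runs there; the other keeps watching, re-estimates where the ball is heading at every step, and adjusts. On this toy setup the adjuster reliably ends up much closer to the ball than the one-shot calculator.

```python
# Toy sketch only: every number here is invented. One fielder computes the
# landing spot once from a noisy reading and commits; the other re-estimates
# and adjusts at every step of the ball's flight.
import numpy as np

rng = np.random.default_rng(7)
G, DT = 9.81, 0.05
NOISE = 0.15             # 15% relative noise on every velocity observation
SPEED = 8.0              # fielder's running speed, m/s
START = 50.0             # fielder's starting distance from the batter, m
VX, VY = 18.0, 16.0      # the ball's true launch velocity components, m/s

def step(pos, target):
    """Move the fielder one timestep towards a target, at limited speed."""
    return pos + np.clip(target - pos, -SPEED * DT, SPEED * DT)

def noisy(v):
    return v * (1 + NOISE * rng.standard_normal())

def one_fly_ball():
    t_land = 2 * VY / G
    x_land = VX * t_land
    guess = noisy(VX) * 2 * noisy(VY) / G    # one up-front calculation
    pos_a = pos_b = START
    t = 0.0
    while t < t_land:
        x, y = VX * t, VY * t - 0.5 * G * t * t
        pos_a = step(pos_a, guess)            # strategy A: commit to the plan
        vy_now = noisy(VY - G * t)            # strategy B: look again, re-estimate
        t_rem = (vy_now + np.sqrt(vy_now ** 2 + 2 * G * max(y, 0.0))) / G
        pos_b = step(pos_b, x + noisy(VX) * t_rem)
        t += DT
    return abs(pos_a - x_land), abs(pos_b - x_land)

misses = np.array([one_fly_ball() for _ in range(500)])
print(f"plan once, run there : average miss {misses[:, 0].mean():.1f} m")
print(f"watch and adjust     : average miss {misses[:, 1].mean():.1f} m")
```

The point is not the numbers, which are invented, but the shape of the strategy: the second fielder never solves the problem in advance; he just keeps correcting.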

This is hard for a complicated systems guy. Complicated systems you can brute-force, and you can predict how they will behave. You can pre-bake solutions, making them simpler. In complex systems you can’t: you need to keep your options open and be prepared to shift, adapt, re-evaluate, and toss out whatever you might have concluded before now. Philip Tetlock’s “Superforecasters” are complex systems thinkers. Baseball players are complex systems thinkers. Richard Dawkins, whom I like to imagine was dyspraxic,[2] is a complicated systems thinker.

If a complex system blows up, “complicated” risk management systems can get in the way

Frequently, complicated-system risk attenuators can, in fact, aggravate risk situations in complex systems. Alarms going off make it harder to hear; multiple alarms increase panic and obscure each other; an obligation to follow prescribed safety routines can impede a quick and surgical response to traumatic situations. There are times, therefore, when you want to throw your checklist out the window.[3]


References

  1. A study a while back found that baseball players, while excellent at catching moving balls, were bad at predicting where a ball would land if they had to stand still.
  2. Largely because he was trying to solve differential equations instead of running after the ball, of course.
  3. I know, I know — try telling that to the chap who landed his plane on the Hudson thanks to his unflappable compliance with cockpit checklists, right?