Complex system

{{a|systems|{{image|Pripyat|jpg|}}}}{{quote|
''Everybody has a plan until they get punched in the mouth.''
:—Mike Tyson}}
===You can’t ''eliminate'' the risk, so focus on ''managing'' it===
A traditional risk manager — that is, one managing [[complicated system]]s and not [[complex system|complex]] ones<ref>An open question — a ''gaping'' open question, like when your goalie has come up for a corner — is ''why'' a traditional risk manager is managing what is undoubtedly a [[wicked environment]] using tools suited to a [[tame environment|tame]] one. But it was ever thus: the [[Black-Scholes option pricing model]], which is predicated on a normal distribution, cannot cope with [[Black swan|tail events]], and its failure in exactly those circumstances led directly to both the [[LTCM]] collapse and the [[Great Financial Crisis]]. It is still widely used today, after all.</ref> — will be conditioned to use control techniques to anticipate and eliminate all risk.
This is hard for a [[complicated system]]s guy. [[Complicated system]]s you can brute-force, and you can predict how they will behave. You can pre-bake solutions, making them simpler. In [[complex system]]s you can’t: you need to keep your options open and be prepared to shift, adapt, re-evaluate, and toss out whatever you might have concluded before now. {{author|Philip Tetlock}}’s “{{br|Superforecasters}}” are complex systems thinkers. Baseball players are complex systems thinkers. Richard Dawkins, whom I like to imagine was dyspraxic,<ref>Largely because he was trying to solve differential equations instead of running after the ball, of course.</ref> is a [[complicated system]]s thinker.


===If a complex system blows up, “[[complicated]]” risk management systems can get in the way===
Frequently, [[complicated system]] risk attenuators can in fact aggravate risk situations in [[complex system]]s. Alarms going off make it harder to hear; multiple alarms increase panic and obscure each other; an obligation to follow prescribed safety routines can impede a quick and surgical response to a traumatic situation. There are times, therefore, when you want to throw your checklist out the window.<ref>I know, I know — try telling that to the chap who landed his plane on the Hudson thanks to his unflappable compliance with cockpit checklists, right?</ref>


{{ref}}