Complex system: Difference between revisions
Amwelladmin (talk | contribs) No edit summary |
{{a|devil|{{subtable|{{complex capsule}}}}}}
===You can’t eliminate the risk, so focus on ''managing'' it===
A traditional risk manager will be conditioned to use control techniques to anticipate and eliminate all risk. In a [[complex system]] this is simply not possible. One must instead depend on local managers and experts making spontaneous decisions to address the unfortunate situation as they see it, under conditions of significant uncertainty. A [[complex system]] is not totally random — in that case any action would be as good as any other — so some control is possible, but it is ''not'' possible to prescribe in advance what that action should be.
Frequently, complicated-system risk attenuators can in fact aggravate risk situations in complex systems. Alarms going off make it harder to hear; multiple alarms increase panic and obscure each other; an obligation to follow prescribed safety routines can impede a quick and surgical response to a traumatic situation. There are times, therefore, when you want to throw your checklist out the window.<ref>I know, I know — try telling that to the chap who landed his plane on the Hudson thanks to his unflappable compliance with cockpit checklists, right?</ref>
Revision as of 21:41, 3 August 2020