Template:Complex capsule: Difference between revisions

From The Jolly Contrarian
[[Complex systems]] present as “[[wicked problem]]s”. They are dynamic, unbounded, incomplete, contradictory and constantly changing. They comprise an indefinite set of subcomponents that interact with each other and the environment in unexpected, [[non-linear]] ways. They are thus unpredictable, chaotic and “insoluble” — no [[algorithm]] can predict how they will behave in all circumstances. Probabilistic models may work passably well ''most'' of the time, but the times when statistical models fail may be ''exactly'' the times you really wish they wouldn’t, as [[Long Term Capital Management]] would tell you. Complex systems may comprise many other [[simple system|simple]], [[complicated system|complicated]] and indeed [[complex system]]s, but their interaction ''with each other'' is a whole other thing. So while you may manage the [[simple]] and [[complicated]] sub-systems effectively with algorithms, checklists and playbooks — and may thereby keep the system stable in normal times — you remain exposed to “tail events” in abnormal circumstances. You cannot eliminate this risk: accidents in complex systems are ''inevitable'' — hence “[[Normal accident|normal]]”, in {{author|Charles Perrow}}’s argot. However well you manage a [[complex system]], it remains ''innately'' unpredictable.
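The point about probabilistic models working “most of the time” can be made concrete with a short simulation (not from the original; the distribution and thresholds here are illustrative assumptions). Fat-tailed data is drawn from a Student-t distribution, a normal model is fitted to the same mean and standard deviation, and the two disagree wildly about how often an extreme move occurs — the normal model is fine in the body of the distribution and wrong by orders of magnitude exactly where it matters:

```python
import math
import random

random.seed(42)

def student_t(df):
    # Draw from a Student-t distribution via the normal / chi-square ratio.
    z = random.gauss(0, 1)
    chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

# Simulate fat-tailed "returns" (Student-t, 3 degrees of freedom).
n = 200_000
returns = [student_t(3) for _ in range(n)]

mean = sum(returns) / n
sd = math.sqrt(sum((r - mean) ** 2 for r in returns) / n)

# Empirical frequency of a move beyond 4 standard deviations.
extreme = sum(1 for r in returns if abs(r - mean) > 4 * sd) / n

# What a normal model with the same mean and sd predicts:
# P(|Z| > 4) = erfc(4 / sqrt(2)), roughly 6 in 100,000.
normal_tail = math.erfc(4 / math.sqrt(2))

print(f"empirical P(|move| > 4 sigma): {extreme:.5f}")
print(f"normal-model prediction:       {normal_tail:.7f}")
```

The fitted normal model typically understates the frequency of 4-sigma moves here by around two orders of magnitude — the LTCM lesson in miniature.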

Latest revision as of 11:33, 3 April 2022
