Systems theory

''The JC’s amateur guide to systems theory''
{{a|systems|}}{{quote|''If you had a population that were miserable and restless because they had nowhere bearable to live, the preferred solution seemed not to be spending money on improving their conditions, but on hiring more police in case things turned ugly.''
{{author|Alan Moore}}, {{br|Jerusalem}}}}[[Systems theory]] eschews the reductionist, deterministic, “[[normal science|scientific]]” disposition and views the world in terms of inter-operating systems. That is to say, it treats the ordinary interactions of life as [[complex]], not merely [[complicated]], problems to solve: as interactions of and between systems. System interactions are necessarily complex in that they are not finite, they are [[non-linear]], and neither the rules of engagement nor the information about the system is complete, coherent or static.
 
Systems are composed of stocks, flows and feedback loops. A good primer is {{author|Donella H. Meadows}}’ {{br|Thinking in Systems}}.
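Meadows’ bathtub is the canonical toy example: the stock is the water level, the tap supplies a constant inflow, and a drain whose outflow depends on the level itself forms a balancing feedback loop. A minimal sketch (names and parameters are illustrative, not from any particular source):

```python
# Toy stock-and-flow model in the spirit of Meadows' bathtub example.
# The stock is the water level; inflow is constant; outflow depends on
# the level itself, which is what makes it a (balancing) feedback loop.

def simulate(steps: int, inflow: float = 2.0, drain_rate: float = 0.1,
             level: float = 0.0) -> list[float]:
    """Return the stock level at each timestep."""
    history = []
    for _ in range(steps):
        outflow = drain_rate * level   # feedback: outflow grows with the stock
        level += inflow - outflow      # net flow adjusts the stock
        history.append(level)
    return history

levels = simulate(200)
# The balancing loop pushes the stock toward equilibrium, where inflow
# equals outflow: here level = inflow / drain_rate = 20.
```

The balancing loop is why the bathtub never overflows: the further the stock is from equilibrium, the harder the feedback pulls it back.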
 
===Complexity Theory===
Organisational systems can be [[Simple system|simple]], [[Complicated system|complicated]] or [[Complex systems|complex]]. Best you know which one yours is. Differentiate between the ''type of system'' and ''how to manage that system''. So:
{{small|80}}
{| class="wikitable"
! {{header left}} System !! {{header left}} System characteristics !! {{header left}} How to manage !! {{header left}} Example
|- style="vertical-align: top;"
| '''[[Simple system]]'''|| Static process. Requires a series of steps. No interaction. Little if/then logic. Fixable (and best handled) by [[algorithm]]. ||  ''Unskilled'' application of algorithm. Suitable for [[Chatbot|machine]] production. || Bake a cake.
|- style="vertical-align: top;"
| '''[[Complicated system]]'''|| Bounded interactive processes: they involve interaction with autonomous agents but within fixed boundaries and according to preconfigured, known and static rules of engagement. All relevant information is ''available to'', even if not necessarily ''known by'', all participants in the system. || ''Skilled'' application of algorithm. Suitable for autonomous operation by [[subject matter expert]]. || [[Chess]] or [[Go]]. Football. Any zero-sum game.
|- style="vertical-align: top;"
| '''[[Complex system]]'''|| Unbounded, interactive process. Involves interaction with autonomous agents without boundaries, without pre-agreed rules, and where information is limited and asymmetric. Rules, boundaries and each participant’s objectives are dynamic and change interactively. Impossible to predict. || Requires expertise, experience, autonomy, imaginative adaptability, [[heuristics]], and the ability to make, adjust and reject provisional conclusions as information changes. || Financial market. Manufacturing process. Air traffic control system.
|}
</div>
 
===[[Simple system]]s===
Simple systems are situations where essentially inanimate objects interact with each other in ways that are fully understood. Lego is a simple system. So is a cake recipe, or a bungee jump. The components of a simple system don’t fight back. Simple systems are therefore predictable. They can only go wrong if components fail or you don’t follow instructions; in either case they fail in predictable ways. As such, simple systems are suitable for checklists, recipes and the like, where [[algorithm]]s can overcome the hubris that will surely rain down on the heads of those who treat simple processes as trivial. Disinfecting your instruments before performing heart surgery, for example, is a simple step to take, but not a trivial one.
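The cake-recipe point can be made in code: a simple system is just an algorithm, executable by an unskilled operator or a machine, and when it goes wrong it fails at an identifiable, predictable step. A hypothetical sketch (the recipe and function names are invented for illustration):

```python
# A simple system as an algorithm: steps run in a fixed order, there is no
# interaction and no adversary, and failure is predictable, i.e. it occurs
# at an identifiable step.

RECIPE = [
    ("preheat oven", True),
    ("cream butter and sugar", True),
    ("add eggs and flour", True),
    ("bake for 25 minutes", True),
]

def run_checklist(steps):
    """Execute steps in order; report the first failure, if any."""
    for name, ok in steps:
        if not ok:
            return f"failed at: {name}"   # the failure mode is predictable
    return "cake rises"

print(run_checklist(RECIPE))                      # cake rises
print(run_checklist([("preheat oven", False)]))   # failed at: preheat oven
```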
====Examples====
Common [[simple system]]s and what happens when they go wrong:
*A cake recipe: cake doesn’t rise.
*(theoretically) A negotiation [[playbook]] with comprehensive escalation procedures (though really it is a disguised [[complicated system]]).
===[[Complicated system]]s===
Complicated systems require interaction with autonomous agents whose specific behaviour is beyond the observer’s control, and might be intended to defeat the observer’s objective, but whose range of behaviour is deterministic, rule-bound and known, and can be predicted in advance, and where the observer’s own behaviour does not itself disturb the essential equilibrium of the system.

You know you have a complicated system when it cleaves to a comprehensive set of axioms and rules; managing it is then a matter of making sure the proper models are being used for the situation at hand. Chess and Go are complicated, but not complex, systems. So are most sports. You can “force-solve” them, at least in theory.

Complicated systems benefit from skilled management and some expertise to operate: a good chess player will do better than a poor one, and clearly a skilled, fit footballer can execute a plan better than a wheezy novice — but in the right hands, and given good instructions, even a mediocre player can usually manage without catastrophe. While success will be partly a function of the user’s skill and expertise, a bad player with a good plan may defeat a skilled player with a bad one.

Given enough processing power, complicated systems are predictable, determinative and calculable. They are tame, not wicked, problems.
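“Force-solving” is literal: because a complicated system is bounded, deterministic and fully specified, brute computation settles it. A minimal minimax over the full noughts-and-crosses game tree (an illustrative sketch, not anything from the source) confirms that best play from the empty board is a draw:

```python
# A complicated system is deterministic and bounded, so given enough
# processing power it can be "force-solved". Minimax over the full game
# tree of noughts and crosses proves the game is a draw under best play.
from functools import lru_cache

LINES = [(0,1,2), (3,4,5), (6,7,8), (0,3,6),
         (1,4,7), (2,5,8), (0,4,8), (2,4,6)]

def winner(b):
    """Return 'X' or 'O' if a line is complete, else None."""
    for i, j, k in LINES:
        if b[i] and b[i] == b[j] == b[k]:
            return b[i]
    return None

@lru_cache(maxsize=None)
def minimax(b, player):
    """Value of position b to 'X': +1 win, 0 draw, -1 loss, under best play."""
    w = winner(b)
    if w:
        return 1 if w == "X" else -1
    if all(b):                         # board full, no winner: a draw
        return 0
    nxt = "O" if player == "X" else "X"
    values = [minimax(b[:m] + (player,) + b[m + 1:], nxt)
              for m, cell in enumerate(b) if not cell]
    return max(values) if player == "X" else min(values)

# Best play from the empty board is a draw: the system is fully calculable.
print(minimax(("",) * 9, "X"))   # 0
```

The same exhaustive-search idea scales (in principle, not in practice) to chess and Go; the game trees are vastly larger, but nothing about them is unknowable, which is exactly what distinguishes complicated from complex.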
====Examples====
Common [[complicated system]]s and what happens when they go wrong:
*Music performance: flying rotten cabbages; no encore.
*Chess, Go, Poker, Bridge: other guy wins.
===[[Complex system]]s===
Complex systems present as “wicked problems”. They are dynamic, unbounded, incomplete, contradictory and constantly changing. They comprise an indefinite set of subcomponents that interact with each other and the environment in unexpected, non-linear ways. They are thus unpredictable, chaotic and “insoluble”: no algorithm can predict how they will behave in all circumstances. Probabilistic models may work passably well most of the time, but the times when statistical models fail may be exactly the times you really wish they didn’t, as [[LTCM|Long-Term Capital Management]] would tell you. A complex system may comprise many other simple, complicated and indeed complex systems, but their interaction with each other will be a whole other thing. So while you may manage the simple and complicated sub-systems effectively with algorithms, checklists and playbooks, and may thereby manage the whole system in normal times, you remain at risk of “tail events” in abnormal circumstances. You cannot eliminate this risk: accidents in complex systems are inevitable — hence “normal”, in Charles Perrow’s argot. However well you manage a complex system, it remains innately unpredictable.
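The unpredictability is not just a matter of missing information: even a one-line rule, perfectly known, can defy point forecasting. The logistic map is the standard toy illustration of this sensitivity (parameters here are illustrative):

```python
# Non-linearity in miniature: the logistic map x' = r * x * (1 - x).
# At r = 4 two trajectories that start a billionth apart diverge
# completely within a few dozen steps, which is why point forecasts of
# complex systems fail even when the governing rule is perfectly known.

def trajectory(x, r=4.0, steps=50):
    """Iterate the logistic map from x and return the path."""
    out = []
    for _ in range(steps):
        x = r * x * (1 - x)
        out.append(x)
    return out

a = trajectory(0.2)
b = trajectory(0.2 + 1e-9)   # a billionth apart at the start
# Early on the paths are indistinguishable; by the end they bear no
# resemblance to each other.
print(abs(a[0] - b[0]), abs(a[-1] - b[-1]))
```

A real complex system is worse still: unlike the logistic map, its rules and boundaries are themselves shifting as you watch.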
====Examples====
Common [[complex system]]s and what happens when they go wrong:
*'''Nuclear power plant''': Chernobyl, Three Mile Island and Fukushima
*'''The environment''' (and any naturally selecting ecosystem, really): Covid-19; global warming
*'''Air traffic control system''': Name your air crash but two classics are Air New Zealand’s Mount Erebus disaster and ValuJet 592 [https://www.theatlantic.com/magazine/archive/1998/03/the-lessons-of-valujet-592/306534/ This is a ''fantastic'' article about the latter].
*'''Financial markets''': Take your pick: [[LTCM]], [[Enron]], [[Global Financial Crisis]]
*'''The world wide web'''
 
{{sa}}
*[[Complexity]]
{{ref}}

Latest revision as of 17:11, 22 September 2024

