Systemantics: The Systems Bible
{{sa}} | |||
*{{br|Thinking in Systems}} | |||
*[[Systems theory]] | |||
{{br|The Peter Principle}} |
Revision as of 19:23, 12 July 2021
This book is a hoot. It isn’t a sober introduction to systems theory — for that, try Donella H. Meadows’ Thinking in Systems — but more a contrarian successor to those great pieces of 60s management wit The Peter Principle and Parkinson’s Law, analysing the Kafkaesque world not in terms of hierarchies but “systems we set up to accomplish some goal”, by dint of which a new entity comes into being: the system itself.
“Now the system itself has to be dealt with. Whereas before there was only the Problem—such as warfare between nations, or garbage collection—there is now an additional universe of problems associated with the functioning or merely the presence of the new system.”
Regular readers may note the similarity with the JC’s own coinage, the second-order derivative, by which risk managers substitute, for monitoring the actual risk, the monitoring of a range of systemic or programmatic indicators of that risk’s presence — a system to keep the risk in check, which you then spend all your time keeping in check, when it might have been better all along just to keep an eye on the risk.
In any case, once that proposition is stated, everything else flows from it.
- A system tends to oppose its own proper function.
- With the introduction of a system, the total number of problems facing the community does not change; they simply change in form and relative importance.
- Systems are persistent: once established they have their own life force and they encroach, expanding to fill the known universe, using the power of positive feedback.
- Complex systems exhibit unexpected behaviour.
- Le Chatelier’s principle: any system tends to set up conditions opposing further operation of the process — as to which see goals and objectives. Especially SMART ones.
- Systems are imposed to correct unexpected problems that have since been solved: they are “fully prepared for the past”.
- Temporary patches are very likely to become permanent, and then structural.
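Gall’s “positive feedback” point lends itself to a toy model. The sketch below is purely illustrative — the function, parameters and growth rule are invented for this note, not drawn from the book — but it shows the mechanism: a system is stood up for every outstanding problem, and each system then breeds problems of its own, so the problem count tracks the system count upward rather than falling to zero.

```python
# Toy illustration (not Gall's) of systems breeding problems via
# positive feedback: every problem gets a system, every system
# spawns fresh problems, and the total never settles at zero.

def simulate(initial_problems: int, steps: int, spawn_rate: float = 0.5) -> list[int]:
    """Return the outstanding-problem count after each step.

    Each step: one system is created per outstanding problem
    (all of them nominally "solved"), then the accumulated systems
    spawn `spawn_rate` new problems apiece.  All numbers are
    arbitrary assumptions for illustration.
    """
    problems = initial_problems
    systems = 0
    history = []
    for _ in range(steps):
        systems += problems                    # a system per problem
        problems = int(systems * spawn_rate)   # ...but systems breed problems
        history.append(problems)
    return history
```

Running `simulate(2, 8)` shows the problem count creeping up rather than down: the systems encroach, as Gall says, expanding to fill the known universe.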