Systemantics: The Systems Bible
{{a|book review|{{image|Systems Bible|jpg|}}}}{{bi}}{{drop|T|his book is}} a hoot. It isn’t a sober introduction to [[systems theory]] — for that, try {{author|Donella H. Meadows}}’ {{br|Thinking in Systems}} — but more a contrarian successor to those great pieces of 60s management wit, {{br|The Peter Principle}} and {{br|Parkinson’s Law}}. But, as they say, take it seriously but not literally, rather than literally but not seriously.
{{author|John Gall}} analyses the Kafkaesque world of modern management not in terms of “hierarchies”, “power structures” or “paradigms”, but of “systems we set up to accomplish some goal”, by dint of which a new entity comes into being: the system itself. And therein lies the problem.
{{Quote|“Now the system itself has to be dealt with. Whereas before there was only the Problem—such as warfare between nations, or garbage collection—there is now an additional universe of problems associated with the functioning or merely the presence of the new system.”}}
Regular readers may notice a similarity to the JC’s favourite, the [[second-order derivative]], by which risk managers substitute for the ''actual'' risk a range of systemic or programmatic ''indicators'' of the presence of that risk — a system to keep the risk under review, which then obliges you to spend all your time keeping the ''system'' in check and [[internal audit|auditing]] its [[key performance indicator]]s, when it might have been better all along to spend the money on some [[subject matter expert|experts]] to keep an eye on the risk itself — rather than, you know, laying them off.

In any case, once that proposition is stated, many great insights flow from it.
*A system tends to oppose its own proper function.
*With the introduction of a system, the total number of problems facing the community does not change; they simply change in form and relative importance.
*Systems are persistent: once established they have their own life force and they encroach, expanding to fill the known universe, using the power of positive feedback.
*Complex systems exhibit unexpected behaviour.
*Le Chatelier’s principle: any system tends to set up conditions opposing further operation of the process — as to which see [[goal]]s and objectives. Especially [[SMART]] ones.
*Systems are imposed to correct unexpected problems that have since been solved. They are “fully prepared for the past”.
*[[The temporary tends to become permanent|Temporary patches are very likely to become permanent]], and then structural.
*The Naming Fallacy: the very act of naming throws everything into a frame of reference (something Donald Trump has exploited well).
Along the way there are wonderful, wry vignettes that might outrage those who look to LinkedIn for their wisdom and career inspiration:
{{Quote|“We have already noted that a supermarket apple is not like the apple we had in mind, that what comes out of a coffee vending machine is not coffee as we once knew it, and that a person who takes a course in leadership training is acting out a behavioral pattern better described as Following rather than Leading.”}}
{{quote|“Prolonged data-gathering is not uncommonly used as a means of not dealing with a problem. ... When so motivated, information-gathering represents a form of Passivity”.}}
Sometimes, data and oversight get in the way.
{{sa}}
*{{br|Thinking in Systems}}
*[[The temporary tends to become permanent]]
*[[Systems theory]]
*{{br|The Peter Principle}}
{{c|Systems theory}}
Latest revision as of 16:42, 5 November 2024