Seeing Like a State

This is the difference, says Scott, between Red Adair<ref>Younger readers may not remember this [https://en.wikipedia.org/wiki/Red_Adair legend of the fire-fighting community].</ref> and an articled clerk. There are some skills you cannot acquire except through experience: learning to sail, riding a bike, or playing a musical instrument. You could spend as much time as you like with textbooks, but you will never master riding a bike without practical rehearsal.


Which brings us to the last connection: that to [[complexity theory]], [[systems analysis]] and [[normal accident]]s theory. All of these come to the same conclusion: if you are dealing with [[complex systems]], especially [[tightly-coupled]] ones with [[non-linear]] interactions, you ''cannot'' solve them with [[algorithm|algorithms]], no matter how much data you have and no matter how sophisticated your conceptual scheme is. The ''only'' way to manage these risks is with experts on the ground, who are empowered to exercise their judgment, make provisional decisions, and adjust them as the situation unfolds. That is, with [[metis]]. If your conceptual scheme has systematically eliminated [[metis]] from your operation, you may carry on in times of peace and equability, but should a crisis come, you are ''stuffed''.


{{sa}}
