It’s not about the bike

For technology not to get in the way in an existential crisis, it must not make a complicated situation ''more'' complicated. At the least, it should be positively designed to ''not'' make diagnostics and resolution harder in an emergency. If you have divided your labour correctly, your technology will handle the routine stuff; your [[subject matter expert]]s the exceptions — exactly the scenarios where technology, checklists and prepared [[risk taxonomies]] have nothing to say. Charles Perrow’s account of the control deck at Three Mile Island as, without warning, reactor coolant pumps began cavitating, thumping and shaking, is instructive:


{{quote|“In the control room there were three audible alarms sounding, and many of the 1,600 lights (on-off lights and rectangular displays with some code numbers and letters on them) were on or blinking. The operators did not turn off the main audible alarm because it would cancel some of the annunciator lights. The computer was beginning to run far behind schedule; in fact it took some hours before its message that something might be wrong with the PORV finally got its chance to be printed. Radiation alarms were coming on. The control room was filling with experts; later in the day there were about forty people there. The phones were ringing constantly, demanding information the operators did not have. Two hours and twenty minutes after the start of the accident, a new shift came on.” <ref>{{br|Normal Accidents}} p. 28.</ref>}}


When the system seems on the brink of catastrophe and the most articulate question you can form about the situation is ''what the fuck is going on?'', you do ''not'' want to be unplugging, decoding, or working around non-functional safety mechanisms.


This is another reason why simplifying your processes and documentation should be your first step.