It’s not about the bike

Automating might give you a short-term productivity bump, but you’ll rapidly bank it and, anyway, if ''you'' can automate a process, so can anyone else. And then there are the downstream costs. Not just the [[Rent-seeking|rent extracted]] by the software vendor, the internal bureaucratic overhead in maintaining, auditing, approving and renewing the software, training legal users, updating the content — the knock-on pain of solving a problem which wasn’t, actually, that you needed Kevlar forks, but that ''you needed to go on a diet and get in shape''.  
===[[System accident]]s===
And this is not to mention the problem of figuring out what to do if there’s a [[system accident]].


[[Technology]] helps you with [[business as usual]]: when you are tilling your fields, your fences are sound, your boundaries known, the range of outcomes understood and catered for, and all is well in the world. Technology ''won’t'' help you when a [[black swan]] arrives and starts dive-bombing the conservatory. Your best outcome is that, when you hit DEFCON 1, your tech ''doesn’t get in the way''. 
 
For technology not to get in the way in an existential crisis, it must not make a complicated situation ''more'' complicated. At the very least, it should be positively designed ''not'' to make diagnostics and resolution harder in an emergency. If you have divided your labour correctly, your technology will handle the routine stuff; your [[subject matter expert]]s, the exceptions — exactly the scenarios where technology, checklists and prepared [[risk taxonomies]] have nothing to say. Charles Perrow’s account of the control deck at Three Mile Island as, without warning, the reactor coolant pumps began cavitating, thumping and shaking, is instructive:
 
{{quote|“In the control room there were three audible alarms sounding, and many of the 1,600 lights (on-off lights and rectangular displays with some code numbers and letters on them) were on or blinking. The operators did not turn off the main audible alarm because it would cancel some of the annunciator lights. The computer was beginning to run far behind schedule; in fact it took some hours before its message that something might be wrong with the PORV finally got its chance to be printed. Radiation alarms were coming on. The control room was filling with experts; later in the day there were about forty people there. The phones were ringing constantly, demanding information the operators did not have. Two hours and twenty minutes after the start of the accident, a new shift came on.” <ref>{{br|Normal Accidents}} p. 28.</ref> }}
 
When the system seems on the brink of catastrophe and the most articulate question you can form about the situation is ''what the fuck is going on?'' you do ''not'' want to be unplugging, decoding, or working around non-functional safety mechanisms.
 
 
This is another reason why simplification of your processes and documentation should be your first step.