It’s not about the bike
Having been traumatised by compulsory inter-house cross-countries in his youth — best ever placing: sixth-last — the JC has always loathed races of any kind, resented those who are good at them, and revelled in any schadenfreude going should the foot-race winners of the world come a cropper. Look, exercise is important, but it is something one should do alone, anonymously, under cover of darkness if possible, and in disguise if not. So it is somewhat galling to be repeating Lance Armstrong’s words, but here goes:
It’s not about the bike.
There are two ways to lose three hundred grams from your loaded frame weight: upgrade to Kevlar forks and graphene spokes, at a cost of a few grand, or cut out all the pies.
So what has this got to do with legal design? Well, automating your existing process as it stands is like upgrading to Kevlar forks instead of getting on the treadmill. Time for another down-home JC-branded Latin maxim, readers: primum comede minus: “First, cut out the pies.”
You will lose ten kilos and save money — on pies, right? — and your current bike will work better. You won’t have to pedal so hard. You might conclude that kevlar forks are a bit of a waste of money. Anything you automate is, necessarily, low value: because you make it low value by automating it.
Automating might give you a short-term productivity bump, but you’ll rapidly bank it and, anyway, if you can automate a process, so can anyone else. And then there are the downstream costs: not just the rent extracted by the software vendor, but the internal bureaucratic overhead of maintaining, auditing, approving and renewing the software, training legal users and updating the content — the knock-on pain of solving a problem which wasn’t, actually, that you needed Kevlar forks, but that you needed to go on a diet and get in shape.
System accidents
And this is not to mention the problem of figuring out what to do if there’s a system accident.
Technology helps you with business as usual: when you are tilling fields, your fences are sound, your boundaries known, the range of outcomes understood and catered for, and all is well in the world. Technology won’t help you when a black swan arrives and starts dive-bombing the conservatory. Your best outcome is that, when you hit DEFCON 1, your tech doesn’t get in the way.
For technology not to get in the way in an existential crisis, it must not make a complicated situation more complicated. At the least, it should be positively designed not to make diagnostics and resolution harder in an emergency. If you have divided your labour correctly, your technology will handle the routine stuff; your subject matter experts the exceptions — exactly the scenarios where technology, checklists and prepared risk taxonomies have nothing to say. Charles Perrow’s account of the control deck at Three Mile Island as, without warning, reactor coolant pumps began cavitating, thumping and shaking, is instructive:
“In the control room there were three audible alarms sounding, and many of the 1,600 lights (on-off lights and rectangular displays with some code numbers and letters on them) were on or blinking. The operators did not turn off the main audible alarm because it would cancel some of the annunciator lights. The computer was beginning to run far behind schedule; in fact it took some hours before its message that something might be wrong with the PORV finally got its chance to be printed. Radiation alarms were coming on. The control room was filling with experts; later in the day there were about forty people there. The phones were ringing constantly, demanding information the operators did not have. Two hours and twenty minutes after the start of the accident, a new shift came on.” [1]
When the system seems on the brink of catastrophe, and the most articulate question you can form about the situation is what the fuck is going on?, you do not want to be unplugging, decoding, or working around non-functional safety mechanisms.
This is another reason why simplifying your processes and documentation should be your first step.
References
- [1] Charles Perrow, Normal Accidents, p. 28.