It’s not about the bike


Having been traumatised by compulsory inter-house cross-countries in his youth — best ever placing: sixth-last — the JC has always loathed races of any kind, resented those who are good at them, and wallowed in any schadenfreude going should the foot-race winners of the world come unstuck.


Look, exercise is important, but it is something one should do alone, anonymously, under cover of darkness if possible, and in disguise if not.

So when the question arises of how one should improve athletic performance, the JC is, well, the sixth-last person in the world you should ask. But he has a fondness for metaphors[1] and he spies a good’un here. For the same principles that apply to elite performance apply to process optimisation of any kind. Now, it is somewhat galling to co-opt the words of Lance Armstrong, of all people, but here goes:

It’s not about the bike. It’s about the pies.

There are two ways to lose three hundred grams from your loaded frame weight: upgrade to Kevlar forks and graphene spokes, at a cost of twenty grand, or lay off the pies.

So what has this got to do with legal design? Well, throwing voguish tech at your existing process is like upgrading to Kevlar forks instead of getting on the treadmill. Time for another down-home JC-branded Latin maxim, readers: primum comede minus: “First, cut out the pies.”

You will lose ten kilos and save money (on pies, right?) and your current bike will work better. You won’t have to pedal so hard. You might conclude that Kevlar forks are a bit of a waste of money.

Anything you automate is, necessarily, low value, because you make it low value by automating it.

Automating might give you a short-term productivity bump, but you’ll rapidly bank it and, anyway, if you can automate a process, so can anyone else. And then there are the downstream costs: not just the rent extracted by the software vendor and the internal bureaucratic overhead of maintaining, auditing, approving and renewing the software, training legal users and updating the content, but the knock-on pain of solving the wrong problem. The problem was never that you needed Kevlar forks; it was that you needed to go on a diet and get in shape.

System accidents

And this is not to mention the problem of figuring out what to do if there’s a system accident.

Technology helps you in business as usual, when you are tilling your fields, your fences are sound, your boundaries are known, and the range of outcomes is understood and catered for: all is well with the world. Technology won’t help you when a black swan arrives and starts dive-bombing the conservatory. Your best outcome is that, when you hit DEFCON 1, your tech doesn’t get in the way.

For technology not to get in the way in an existential crisis, it must not make an already complicated situation more complicated. At the very least, it should be positively designed not to make diagnostics and resolution harder in an emergency. If you have divided your labour correctly, your technology will handle the routine stuff and your subject matter experts the exceptions: exactly the scenarios where technology, checklists and prepared risk taxonomies have nothing to say. Charles Perrow’s account of the control room at Three Mile Island as, without warning, the reactor coolant pumps began cavitating, thumping and shaking, is instructive:

“In the control room there were three audible alarms sounding, and many of the 1,600 lights (on-off lights and rectangular displays with some code numbers and letters on them) were on or blinking. The operators did not turn off the main audible alarm because it would cancel some of the annunciator lights. The computer was beginning to run far behind schedule; in fact it took some hours before its message that something might be wrong with the PORV finally got its chance to be printed. Radiation alarms were coming on. The control room was filling with experts; later in the day there were about forty people there. The phones were ringing constantly, demanding information the operators did not have. Two hours and twenty minutes after the start of the accident, a new shift came on.” [2]

When the system seems on the brink of catastrophe and the most articulate question you can form about the situation is “what the fuck is going on?”, you do not want to be unplugging, decoding, or working around non-functional safety mechanisms.

So, fundamental system design principle: first, cut out the pies.

All other things being equal, the optimum amount of technology to have in a given situation is none. Tech necessarily adds complication, cost and confusion. Therefore your first question is: how will technology improve the situation? Ask not just in terms of reduced cost but of reduced waste.



References

  1. And pies.
  2. Charles Perrow, Normal Accidents, p. 28.