Bad apple
/bæd ˈæpl/ (n.)
One of those mischievous human imps occupying unobserved crevices in the great steampunk machine who, by their human frailty, ruin the best-laid plans of the machines.
The JC’s amateur guide to systems theory™
On the conventional wisdom, bad apples are the sole remaining fly in the ointment separating us from the sunlit uplands of financial services utopia that our patient labours by now have surely earned. Once the last bad apple has been rooted out all will be well in perpetuity.
It’s not clear what we’ll all then do, but this is but a quibble.
The JC ponders human nature a lot, as you know. He wonders whether we should be quite so credulous. Is not the barrel of bad apples bottomless? Aren’t bad apples just gonna be bad?
Would we not be better worrying less about curing humans of their nature, and more about neutralising its unwanted effects?
For there will always be bad apples, and they will always seek out, find and exploit zero-day flaws in the system. We should expect this, because it is in their — our — nature. This is what bad apples do.
Bad apples will find zero-day vulnerabilities exactly where the system least expects them, and is therefore paying least attention: ostensibly harmless, sleepy backwaters. LIBOR submissions. The accounting department. The Delta-one index swaps desk. In a family office.
The question is not “where are all the bad apples?” so much as “where are all the zero-day vulnerabilities they will surely exploit?”
And the more byzantine, multi-dimensional, formalised, technology-overlaid and complex our system becomes, the more vulnerabilities it will have, and the harder it will be to find them when the bad apples start playing up.
Leaving it to “the system” to detect and destroy bad apples — by policy attestation, outsourced compliance personnel in Manila reading from a playbook, “A.I.-powered” software applications — is the Bond villain’s way of despatching an enemy: you tie it up and leave it unattended while a nasty-looking, but plainly fallible, clockwork machine counts down from a thousand.
In the meantime these elaborate risk control systems tend to snare peaceable, but ignorant, citizens as they go about their quotidian business, while the bad apples, wise to the ways of the world, have already worked out the flaws and the work-arounds.
How to spot a bad apple
The regrettable thing about bad apples is this: they have a habit of looking like boring functionaries, or even the good guys, right up to the moment that they don’t.
Before you know it’s a bad apple, a good bad apple doesn’t look like a bad apple. Bad bad apples look like bad apples, so they quickly get rooted out by good apples. Even a bad good apple can spot a bad bad apple.
But good bad apples: well, Q.E.D., no-one believes they are bad apples. That’s what’s so good about them.
Hence, our controversial proposal: A bad apple that doesn’t look like a bad apple isn’t a bad apple.
So it seems to us it won’t really do to say we must be better at spotting bad apples — thereby casting, by association, the stigma of bad appledom on those mediocre apples who failed to spot them. Why did they not notice perfidy going on around them? Are they uncommonly stupid, or have their bad apple detectors somehow been disarmed?
Might they have been disarmed by process? To test this hypothesis, consider what happens to those within our formalistic system who do call out bad apples. People like Bethany McLean, Harry Markopolos, Erin Arvedlund, Dan McCrum, and that junior credit officer at Credit Suisse who said, of Archegos, “I need to understand purpose of having daily termination rights ... if client is not amenable to us using those rights.”
These people are regarded, before the fact, as bad apples. Not bad bad apples,[1] but impertinent: irritants; turbulent priests the place would be better off without. Meanwhile the real bad apples carried on with their heroic poses — NASDAQ chairmen, bank chairmen, visionary innovators, star traders. They only started to look like bad apples after the fact.
Before and after the fact: a play in two acts
Quiz time: taking the information supplied about who everyone thought was the hero, or bad apple, before a celebrated financial markets catastrophe, fill in who you think it might have turned out to be after the event.
| Incident | Hero (before) | Bad apple (before) | Hero (after) | Bad apple (after) |
|---|---|---|---|---|
| Enron | Jeff Skilling, Ken Lay, Andrew Fastow | Fortune journalist Bethany McLean; short-seller Jim Chanos | _______ | _______ |
| Madoff | Bernie Madoff, Fairfield Sentry, the SEC | Option trader Harry Markopolos; Barron’s journalist Erin Arvedlund | _______ | _______ |
| Barings | Nick Leeson, Peter “not terribly difficult” Baring | Er... | _______ | _______ |
| Archegos | Bill Hwang; co-heads of PB, everywhere | Junior credit officer | _______ | _______ |
| FTX | Sam Bankman-Fried, Caroline Ellison | Matt “So, it’s a Ponzi scheme?” Levine | _______ | _______ |
| Wirecard | Markus Braun, Jan Marsalek, BaFin | FT journalist Dan McCrum; internal lawyer Pav Gill; short-seller Matthew Earl | _______ | _______ |
The JC’s view: the “bad apple” concept is not a good one if the virtue of one’s applehood is only apparent in hindsight.
What to do
All of this hindsight-coloured hand-wringing is good sport, but what to do about it? Regular readers might not be surprised to hear the JC say the answer lies in deprogramming the steampunk machine and asking people to use their experience, judgment and intuition to ask unusual questions.
It isn’t hard to imagine the scene: a weekly operational risk meeting with a standing agenda, designed systemically and mechanically to canvass and manage risks to the business.
At this meeting senior “stakeholders” will discuss cash breaks, outstanding undocumented confirms, position concentrations across the book. Things like that. The inevitable procedural glitches of life in a complicated modern financial services business. All kinds of metrics will be presented and analysed, laid out in graphs, charts and data tables. A dashboard of “high risk” clients, derived from these operational metrics, may be presented, but the RAG array will read uniform green — perhaps studded with the odd amber — an easily-addressed talking point included “for good order” but presenting, we are assured, no materially elevated risk of loss.
It will be like this because we are acculturated to being in control: to systems operating, in good standing, all engines ticking along without significant strain. We have been acclimatised to believe that the greatest sin is to disrespect process.
But what good is a risk report designed to tell you everything is under control? What real world function does this fulfil?
You ask, “did Malachite appear on any risk reports in the two years leading to its collapse? Did Archegos? Did Amaranth?”
But these are rhetorical questions, and you don’t ask them lest you become the bad apple.
But imagine if the agenda were different: who are your top five riskiest clients? Who are you most worried about? Consider size, trading history, operational sophistication, timeliness, responsiveness.
Who, in your bones, makes you feel most nervous? Are they the same clients as last week? Last month?
Then ask sales: who is printing the most business? Who is generating the most revenue? Whose portfolio is doing the best? Who, in your bones, do you trust the least?
Are they the same clients as last week? Last month?
Ask risk: who has the most leverage? Where are the concentrated positions? Who has the thinnest liquidity? The least equity? Whose docs and margin lockups are the most severe?
Have all risk control and business groups discuss these observations together. Do it in person. No decks, no BlackBerrys, no interruptions. Require everyone to engage. Everyone should contribute. Everyone should know each other’s fundamental parameters. Everyone should be interested. Ask: if these are our biggest risks, what would we do differently?
The point of a risk meeting is surely not to persuade the steerco that all is well — that the RAGs are green — but to work out where the risk is most likely to be. We should be looking for amber lights, not burying them. If you can’t think of any, that is not a sign that all is well: it is a sign you are not doing your job.
There is risk. Have a considered theory as to where it might be.
See also
- Human error
- Sidney Dekker’s The Field Guide to Human Error Investigations
- Rumours of our demise are greatly exaggerated
- ↑ Though Dan McCrum was subject to a criminal investigation, so he might feel differently about that.