The JC’s amateur guide to systems theory™
Bad apple
/bæd ˈæpl/ (n.)
One of those mischievous human imps occupying unobserved crevices in the great steampunk machine who, by human frailty, ruins the best-laid plans of the machines. Bad apples need not be mendacious, ill-spirited or even conscious, but often are. Bernard Madoff was a bad apple, but so were the GameStop share rally and Citigroup’s archaic loan servicing software.
On the conventional wisdom, bad apples are the last remaining fly in the ointment. They alone keep us from the sunlit uplands of financial services utopia that our collected labours have surely earned. Once the last one has been rooted out, all will be well.
It’s not clear what we’ll all then do, but this is surely just a quibble: the problem we would love to have.
Bad apples and complex systems
The JC likes to ponder human nature, however inexpertly. He wonders whether we should be quite so credulous. Is not the barrel of bad apples bottomless? Aren’t bad apples just gonna be bad?
Would we not be better worrying less about curing humans of their basic nature, and more about neutralising its unwanted effects?
For there will always be bad apples, and they will always seek out, find and exploit zero-day flaws in our fragile systems. We should expect this, because it is in their — our — nature. This is what bad apples do.
Bad apples will find zero-day vulnerabilities exactly where we least expect them, and are therefore paying least attention: ostensibly harmless, sleepy backwaters. LIBOR submissions. The accounting department. The outsourced loan servicing team in Bangalore. The delta-one index swaps desk. A family office.
The question is not “where are all the bad apples?” but “where are all the zero-day vulnerabilities they will surely exploit?”
The answer: no-one knows.
And the more byzantine, multi-dimensional, formalised, technology-overlaid and complex our systems become, the more vulnerabilities there will be, and the harder they will be to find, when they start playing up.
Leaving it to “the system” to detect and destroy bad apples — by policy attestation, outsourced compliance teams reading from playbooks, “A.I.-powered” software applications — is surely the Bond villain’s way of despatching an enemy: you tie it up, gloat for a while, deliver a quick monologue and then leave it unattended while a nasty-looking, but plainly fallible, clockwork machine counts down from a thousand.
In the meantime, and while the risk control gin-traps snare other passing, peaceable, but ignorant, citizens as they go about their quotidian day, those bad old apples, wise to the world, have long since untied their bonds and made stealthily away.
The regrettable thing about bad apples is their habit of looking like boring functionaries, or even good guys, right up to the moment that they don’t.
Good bad apples and bad bad apples
Before you know it’s a bad apple, a good bad apple doesn’t look like a bad apple. Sure: bad bad apples look like bad apples; they quickly get rooted out by good apples. Even a bad good apple can spot a bad bad apple.
But good bad apples: well, Q.E.D., no-one believes they are bad apples. That’s what’s so good about them.
Hence, our controversial proposal: A good bad apple, that doesn’t look like a bad apple, isn’t a bad apple.
It won’t do to say “our good apples must be better at spotting bad apples” if, at the time of looking, our bad apples look like good apples. That would only spread, by association, the stigma of bad appledom to our good apples. And if bad good apples look like bad apples, while our good bad apples look like good apples, you can see we are in a pretty fix.
We should ask why no-one spotted them. Are our good apples just not that good, or have their “bad apple detectors” somehow been disarmed?
Might they have been disarmed by process?
To test this idea, consider what happens to those good apples within our formal systems who do spot the bad apples, and who call them out. People like journalists Bethany McLean, Erin Arvedlund and Dan McCrum, options trader Harry Markopolos and that poor junior credit officer at Credit Suisse who was told off for asking “why do we even have daily termination rights if the client is not amenable to us using them?”
As soon as they said their piece these people became, before the fact, bad apples. Not bad bad apples,[1] sure, but impertinent bad apples: impolitic apples; irritating apples; turbulent apples the place would be better off without.
Meanwhile the real bad apples carried on with their heroic poses — NASDAQ chairmen, bank chairmen, visionary innovators, star traders — as good good apples. They only started to look like bad apples after the fact.
Before and after the fact: a play in two acts
Quiz time: using the information supplied about who everyone thought was the hero, or the bad apple, before each celebrated financial markets catastrophe, fill in who you think they turned out to be after the event.
The JC’s famous “guess the bad apple”™ game

| Incident | Hero (before) | Bad apple (before) | Hero (after) | Bad apple (after) |
|---|---|---|---|---|
| Enron | Jeff Skilling; Ken Lay; Andrew Fastow | Fortune journalist Bethany McLean; short-seller Jim Chanos | _______ | _______ |
| Madoff | Bernie Madoff; Fairfield Sentry; the SEC | Option trader Harry Markopolos; Barron’s journalist Erin Arvedlund | _______ | _______ |
| Barings | Nick Leeson; Peter “not terribly difficult” Baring | Er... | _______ | _______ |
| Archegos | Bill Hwang; co-heads of PB, everywhere | Junior credit officer, Credit Suisse | _______ | _______ |
| FTX | Sam Bankman-Fried; Caroline Ellison | Matt “So, it’s a ponzi scheme?” Levine; Terry Duffy (CME CEO) | _______ | _______ |
| WireCard | Markus Braun; Jan Marsalek; BaFin | FT journalist Dan McCrum; internal lawyer Pav Gill; short-seller Matthew Earl | _______ | _______ |
The “bad apple” concept is not a good one if you can only gauge a fellow’s applehood in hindsight.
The role of process in all of this
Now. Hindsight-powered hand-wringing is all good sport, but what to do about it?
Regular readers will not be surprised to hear the JC say that deprogramming the steampunk machine and asking people to use their skill, judgment and experience might be part of it.
Ask searching questions.
But asking searching questions is not how modern organisations like to work. They are instead designed to give the impression of this kind of governance, while delivering nothing of the kind. This is how management by committee works.
Enter the Opco
Imagine the scene: a monthly risk operating committee meeting with a standing agenda designed systemically and mechanically to identify, minimise and manage risks to the business. Some snivelling COO functionary will have spent the preceding fortnight issuing progressively pointed warnings to “stakeholders” chasing their contributions to the 300-page deck that will serve as materials for the meeting — a deck whose existence is mandated by the committee’s terms of reference, and whose target operating model demands it be circulated 48 hours in advance.
Not a soul will have read these materials before the meeting — it wouldn’t be physically possible at the average adult reading speed — and nor would one be any wiser if she had: the COO’s muted threats are just the weft and warp of the financial services dominance display. It is all very performative. As, indeed, is the deck.
Each risk function will dispatch mid-ranking delegates to attend the meeting. These are essentially votive lambs. They are offered up to take a beating, if one is needed, without making things worse for those who sent them, so must be resilient enough not to break down in tears at the first sign of angst, and savvy enough not to throw their superiors under the bus they assuredly deserve to be under. The delegate must “talk to her slides” — though in practice she will understand very little about them — aiming to sound informed enough for her contribution to pass without remark, but not so informed as to prompt questions.
If the Opco chair has got out of the wrong side of bed, or should a delegate’s attestations be too anaemic, or not anaemic enough, the chair may snap. She will give the delegate a five-minute shellacking in front of the assembled. This is the modern-day equivalent of a public stoning — not to the death, but “to the pain”: there are three hundred pages to get through, and eighteen risk groups presenting, after all. For most delegates, attendance is a two-hour game of Russian roulette with only a handful of bullets in what is quite a large chamber. Consolation takes the form of the private chat channels, alive with wincing wonderment while the eviscerations happen.
Should the Opco chair come for you, her question will not be “where is your risk?” but the far stupider “why are you displaying a risk?” — as if “risk” were not an immutable function of commercial life. Such grumpiness is outdated in our compulsively empathetic times, and may soon pass into history, the same way bear-baiting, throwing Christians to lions and rucking with your studs all have. We think this is a pity: financial services ought to be a blood-sport: there should be some sense of jeopardy. We lose something important if everyone is kind, respectful of standpoint, mindful of lived experiences and inclined instead to knife people in the back, passive-aggressively, in private.
In any case, the Opco will methodically plough through each department’s slides, all of which will tell variations of the same story: in the main, plain sailing but, by way of colour, the odd fixable glitch in process — nothing serious; just the inevitable operational snags of modern financial services — and, for those, a remediation plan, already in train, for how they will be resolved.
All kinds of metrics will be presented, analysed and set out in voluminous graphs, charts and data tables. There may be a dashboard of “high risk” situations — but only ones numerically derived from metrics. Its RAG array will read, mainly, uniform green. Perhaps the odd amber, for the sake of punctuation, attesting to easily-addressed, low-impact hazards included “for good order” and with confident assurances that there is no elevated risk of loss.
It will be like this because we are enculturated to always need to be in control, for all systems to be go, all processes in good standing, all engines ticking over without significant strain. We have been acclimatised to believe that the greatest sin is to disrespect process. If you disrespect process, you can be blamed. If you don’t, you can’t.
But what use is a risk process that is designed to tell you everything is under control?
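Consider a minimal sketch of how such a dashboard is typically assembled — the metric names, limits and figures below are all invented for illustration, not any firm’s actual system. A threshold-driven roll-up of this kind can only ever flag the hazards someone has already thought to measure and set a limit against; a zero-day vulnerability is, by construction, invisible to it.

```python
# A minimal sketch -- metric names, limits and figures are invented for
# illustration, not any real firm's system. The dashboard can only flag
# hazards someone has already thought to measure and set a limit against.

METRIC_LIMITS = {  # metric -> (amber threshold, red threshold)
    "settlement_fails_pct":  (2.0, 5.0),
    "open_limit_breaches":   (5, 10),
    "kyc_refreshes_overdue": (20, 50),
}

def rag(value, amber, red):
    """Classify one metric against its pre-agreed thresholds."""
    if value >= red:
        return "RED"
    if value >= amber:
        return "AMBER"
    return "GREEN"

def dashboard(observations):
    """Roll this month's observed metrics up into the Opco's RAG array."""
    return {
        name: rag(observations.get(name, 0), amber, red)
        for name, (amber, red) in METRIC_LIMITS.items()
    }

# Comfortably inside every limit, so the array reads uniform green --
# whatever may be brewing, unmeasured, on the delta-one desk.
print(dashboard({"settlement_fails_pct": 1.1,
                 "open_limit_breaches": 3,
                 "kyc_refreshes_overdue": 12}))
# {'settlement_fails_pct': 'GREEN', 'open_limit_breaches': 'GREEN',
#  'kyc_refreshes_overdue': 'GREEN'}
```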
Did LTCM appear on broker risk reports before it collapsed? Did Amaranth? Did Malachite, or Archegos? We hope the answer here is “no,” because that means there’s a bad apple. If it was “yes,” and no-one intervened, then the system has broken down. And someone is going to get shot.
We can see here how, curiously, a good bad apple — the kind that is so good that no-one can be blamed for not having noticed it — is, for the prospects of those who manage operating committees, a kind of good apple, in that it presents a pass; an alibi; an excuse for being none the wiser.
What tawdry games we play.
The Opco, reimagined
Forgive us for a moment of science-fiction.[2]
Imagine a different kind of Opco. A fantasy Opco, designed not to protect the posteriors of those at its helm with plausible deniability, but designed to actually look for concealed bad apples.
Have all risk control and business groups discuss their observations together. Do it in person. No decks, no BlackBerries, no-one phoning in. No interruptions. Put on lunch. Open minds. No eviscerations. Invite people across the ranks, but leave titles at the door. Everyone should engage. Everyone should contribute.
On this one, the standing agenda is simple: ask each delegate the same set of open questions and throw the answer to the floor: not by way of cross-examination, but by way of open-eared enquiry, to consider things that might have escaped the executive’s attention.
What is on your mind?
What are you worrying most about?
What should we worry most about?
What are we missing?
What are your top five concerns for the stability and profitability of the bank?
Sometimes the most counterintuitive questions might provide food for thought.
Which client is printing the most business?
Who is generating the most revenue?
Who is borrowing the most money?
Who is generating the most commission?
Who diverges most from the pack?
Whose performance seems too good to be true?
Who has the most leverage?
Who has the biggest positions?
Which are the most concentrated names?
Where is the thinnest liquidity?
Whose docs and margin lockups are the most severe?
Note: a lot of names on the JC’s nascent financial disasters roll of honour would have come up: Archegos. LTCM. Enron. Greensill. Abraaj. Amaranth. Wouldn’t that be a more effective way of surfacing bad apples?
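By way of illustration — strictly finance fiction, with every client name, field and number below invented — here is a minimal sketch of how a couple of those questions might be answered from exposure data the firm already holds.

```python
# A minimal sketch, strictly finance fiction: every client name, field and
# figure below is invented. The point is only that the questions above are
# cheap queries over exposure data a prime broker already holds.

from dataclasses import dataclass

@dataclass
class ClientExposure:
    name: str
    gross_positions: float  # total gross market value, in millions
    equity: float           # client's own capital, in millions
    top_name_pct: float     # share of the book in its single largest position

    @property
    def leverage(self) -> float:
        return self.gross_positions / self.equity

BOOK = [
    ClientExposure("Quiet pension fund",       2_000, 1_800, 0.04),
    ClientExposure("Mid-sized macro fund",     5_000, 1_000, 0.10),
    ClientExposure("A certain family office", 20_000, 2_000, 0.35),
]

# "Who has the most leverage?"
print(max(BOOK, key=lambda c: c.leverage).name)
# "Which are the most concentrated names?"
print(max(BOOK, key=lambda c: c.top_name_pct).name)
# Both print "A certain family office": the data was always there; the
# question is whether anyone at the Opco thought to ask for it.
```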
And then he woke up and it was all a dream
Of course, it is preposterous that any self-respecting management committee would work like that. The reasons why are as immutable, and predictable, as they are misplaced. The firm is a self-perpetuating autocracy. You do not run autocracies by empowering subordinates. Who knows what kind of silly things they would say.
Regulators would puke
We have to submit the minutes of our risk meetings to the regulator. They have great powers to demand further information from us. The last thing we want is to have them asking difficult questions. We wish to create the impression of calm, ordered, measured, control. Nothing to see here folks, move along. We can’t afford to give any kind of impression there are things we do not know, things we cannot manage, or things about which we are worried in our business. Encouraging coal-face staff to indulge their paranoid fantasies is the last thing we should do.
There is certainly sense in this, but it is insane all the same: the best way of managing our regulator — a body whose existential purpose is to manage the risk of catastrophe — is, apparently, to be wilfully blind to the risk of catastrophe.
Senior management would puke
Even if the regulators would be cool with it — they wouldn’t — the chair of the Opco would not. She knows the golden rule of cross-examination: don’t ask questions to which you don’t know the answer.
“Imagine,” she might say, “if someone flagged this kind of crazy risk in a risk meeting, and we discussed it, and we decided to do nothing about it, and then that exact crazy risk happened. Management would be incandescent. We might get disciplined. Or fined. Or even fired.”
Rightly.
Further reading
Sidney Dekker has written persuasively about management’s habit of blaming operator error.[3] The late Charles Perrow wrote brilliantly[4] about system accidents. James C. Scott writes fabulously about the executive’s blindness to critical informal systems.
There is a compelling case that organisations can do more to empower experts, declutter systems and control mechanisms and delayer middle management to improve practical risk management.
See also
- Human error
- Sidney Dekker’s The Field Guide to Human Error Investigations
- Rumours of our demise are greatly exaggerated
References
- ↑ Though Dan McCrum was subject to a criminal investigation, so he might feel differently about that.
- ↑ Strictly speaking, finance fiction. The genres are related.
- ↑ Sidney Dekker’s The Field Guide to Human Error Investigations
- ↑ Charles Perrow, Normal Accidents: Living with High-Risk Technologies