Policy

The Jolly Contrarian’s Glossary
The snippy guide to financial services lingo.™
“Many policies are organizational scar tissue — codified overreactions to situations that are unlikely to happen again”.
- Jason Fried

Policy is organizational scar tissue[1]. It's the sheep they’ll hang you for. It is the dominant ideology of modern management theory. Policy, and process, are seen as practically inviolable: immovable.

Management orthodoxy is predicated on policy and process being the fundamental layer of organisational competence. So, for example, a root cause analysis using the five whys method is designed to arrive, as its root cause, at the policy with which someone failed to comply.

Policy is the mountain; the workers are Mohammed. So calling out substandard performance in the workforce is orthodox business management practice. But calling out substandard process or, heaven forfend, policy, is a kind of sedition.

But policy is a proxy. It is a second-order derivative of the intractably complex life of a modern organisation. Compliance with policy is the quantifiable thing that an internal audit department can glom onto. It requires no qualitative assessment, no subject matter expertise and no judgement. "It says here you must do this. Did you do this?"

All this assumes that the commercial landscape your policy is meant to cover is fully mapped. It is arable land: a fully scoped production line where all inputs, all outputs and all contingencies are accounted for. No frontiers, no known unknowns, in sight.

Here a policy is prudent but, in these artificially intelligent times, it too can be coded and automated. There is little here for internal audit to see, largely because processes of this kind are by their nature trivial and of limited value. Where there is no risk, there is no reward.

And therein lies the rub. Any fully automated, algorithmic process within an organisation is necessarily one of limited risk and limited value.

Where do significant risks exist? At the frontiers. In the wild west. Beyond the pale. Beyond the comfort of a fully worked-out algorithm. Where "here be dragons". Where there is maximum opportunity to add value: in a commercial context, to make or lose money; in a government context, to save lives. And where there be dragons, a policy can be your worst enemy.

To override a policy is to threaten the integrity of the organisation. To subvert its governance. It is to break a rule. In the ordinary course, failure to follow a policy will be immediately censured by the bureaucrats from internal audit, and an appeal to the substance of the matter will likely fall on deaf ears. Therefore, any employee faced with a situation to which a policy applies is highly unlikely to be prepared to override it.

We are all familiar with the ghastly tale of Grenfell Tower and the now infamous "stay put" policy.

The "stay put" policy had been established many years previously for situations of communal living, and was predicated on certain assumptions about the construction of the building and the nature of the risk. It was designed to save the lives of residents in a tower block who were notionally unaffected by the immediate fire. Therefore, overriding that policy necessarily involved taking some risk. That is why the policy was there: precisely to avoid taking that risk.

It is easy to be wise with hindsight, but put yourself in the position of the fire personnel on the ground at Grenfell as the situation was unfolding. It unfolded very, very quickly. At the time, no one knew what the outcome would be. No one knew that more than 70 people would die. What the personnel did know was that there was a stay put policy in force.

We also cannot know what would have happened had the stay put policy been overridden. Certainly the personnel on the ground at the time could not know that. It seems likely that it would have saved lives. But had it not: what then would the culpability have been of the personnel who made the decision to deviate from a policy?


References