False Assurance: how ICAEW’s new film invites us to think about human factors in auditing

I had the privilege of being invited to speak at ICAEW’s premiere of False Assurance, “an exciting film drama created to provoke discussions on how accountants, auditors and company directors should act when faced with difficult situations.” Here is a slightly extended version of the speech I gave.

It is a privilege and a pleasure to be given the opportunity to share some personal reflections on False Assurance. It is really excellent – I love it – and I think that Duncan and the Professional Standards team should feel really proud not only of the content but also of the production values.

I first watched the film four weeks ago and I have to say I had an almost visceral reaction to it.  It was very uncomfortable to watch.  The scenarios were so plausible.  Also, I have to confess to being a bit of an aficionado of those classic public information films from the 1970s – you know, the ones that dole out disfigurement and death to drink-drivers, children trespassing on railway lines and women running in the street.  This film is like one of those: it builds up a sensation of mounting dread.  You know something bad is going to happen to these nice people, but what? And to whom?  Here, the answer might as well be: everything that possibly could, and to everyone.

That’s the beauty of it. The scenario develops a number of factors that together allow corruption and fraud to go undetected for some time. None of the characters is unbelievably good, and none unbelievably bad – all of them succumb to pressures that we see in real life in one form or another.

I’ve worked in both Professional Practice and Audit Quality for a number of years now, so I’m particularly interested in how the auditors in the film behave and why – and how we should respond to that. In my experience, audit firms tend to take what we might call a “person approach” to dealing with quality issues. Poor decisions are seen as arising primarily from flaws in an individual person’s mental processes such as forgetfulness, inattention, poor motivation, carelessness, negligence, and recklessness.

When we try to eliminate individual weaknesses, the sort of measures we put in place are directed mainly at reducing unwanted variability in human behaviour.  It’s a regulatory compliance approach.  So we ask for more procedures and more checklists; we design disciplinary measures that appeal to fear – if not fear of litigation, fear of sanctions – naming, blaming, shaming and, these days, fining.  There’s an uncomfortable implied moral subtext to this approach in that it seems to inherently assume that bad things happen to bad people.

The film instead highlights the value of what we might call the “system model”.  In this model for understanding failure, human errors are seen as inevitable products of systemic weakness. We can’t change the human condition, so we have to change the conditions in which humans operate.

An audit team is a system of defensive layers – like the “Swiss Cheese” model proposed by James Reason, Professor of Psychology at Manchester University[1]. In each slice of cheese, holes are continually opening, shutting, and shifting. The presence of holes in any one “slice” does not normally cause a bad outcome; that usually happens only when the holes in many layers momentarily line up – as in the film, where there are multiple opportunities for the fraud to be identified, and multiple failures, some individually minor, are required for it to go undetected.

In the film, you see the individual active failures – poor decisions made by each character – but you also observe the latent conditions that increase the possibility of poor decision-making.   Professor Reason uses the analogy of mosquitos for active failures, versus mosquito breeding grounds for latent conditions.  You can swat all the mosquitos you want, but if you don’t drain the swamp, they’ll keep coming – and you’ll have to keep swatting.  In the film, these swampy conditions include overwork, time pressure, a culture of rewarding strong relationships with client executives and the sort of hierarchy where none of the senior people seem to seriously entertain the possibility that the concerns of more junior members of the team might ever amount to much.

I want to make particular mention too of the way the CFO in the film plays on institutional sexism by criticising the female audit partner for “interrogating” him. Ongoing research at Stanford University shows that women receive 2.5 times as much feedback as men about aggressive communication styles. Another study found that negative personality criticism appeared thirty times more frequently in appraisals of women than in appraisals of men, even though everyone in the population selected for that review had been rated an equally strong performer. The women were much more likely to be described as “abrasive”, “coming on strong”, “strident” or “aggressive”.

So one of the latent conditions in our profession is a particular disadvantage to women.  It seems women are much more likely to be criticised for the robust challenge, persistence and scepticism that would be praised in a male colleague.

So why does the idea of personal responsibility for failure persist? Well for one, we tend to prefer it – it resonates with our ideas of responsibility and accountability.  It’s much easier to sanction a person than to change the culture that fostered that person’s mistakes. And sadly, we’re all human and we find blaming individuals emotionally satisfying.

We also like the idea of single causes because we are afraid of risks we can’t control. Sidney Dekker, Professor of Safety Science at Griffith University, Australia, says: “The failures which emerge from normal everyday systems interactions question what ‘normal’ is. It is this threat to our epistemological and moral accountancy that makes accidents of this kind so problematic. We would much rather see failure as something that can be traced back to a single source, a single individual. When this is not possible in the assignation of blame and responsibility, accuracy or fairness matters less than closing or reducing the anxiety associated with having no cause at all. In the Western scheme of things, being afraid is worse than being wrong, being fair is less important than being appeased. Finding a scapegoat is a small price to pay to maintain the illusion that we actually know how a world full of risk works.”[2]

So what do we do?  We need a reporting culture and we need safe spaces to analyse what is reported.  Without a detailed analysis of mishaps, incidents and near misses we have no way of uncovering recurrent error traps or of knowing where the “edge” is until we fall over it.  Both Reason and Dekker refer in their work to “Just Culture”, in particular restorative Just Culture rather than retributive Just Culture.

A Just Culture is one with a vital “collective understanding of where the line should be drawn between blameless and blameworthy actions” (Reason).  It’s a culture that learns and prevents by asking why it made sense at the time for highly intelligent, highly educated, highly trained and highly regulated professionals to do what they did. How many audit firms are really asking that question about failures?

I am hoping, therefore, that no-one is going to leave the film thinking that the next step is to warn audit partners that Bad Things will happen to them if they don’t get written representations about related parties. Let’s instead look for our swamps and set about draining them. In the context of the film that might include:

  • Looking at how complaints to “relationship” partners about audit team members are handled
  • Taking an honest look at whether and how individual patronage plays a part in promotion processes and in the allocation of valuable work within firms
  • Examining trends and differences in the language used in performance appraisals to describe the same behaviours when shown by women or by men

Those are just a few suggestions – there are many other areas to consider.

As Professor Reason says “Perhaps the most important distinguishing feature of high reliability organisations is their collective preoccupation with the possibility of failure. They expect to make errors and train their workforce to recognise and recover them. They continually rehearse familiar scenarios of failure and strive hard to imagine novel ones. Instead of isolating failures, they generalise them. Instead of making local repairs, they look for system reforms.”

I would like to say that I work in a High Reliability Organisation.  But are we audit firms prepared to turn that unflinching scrutiny on ourselves?

[1] Reason J. Human error: models and management. BMJ 2000 Mar 18;320(7237):768–770.

[2] Dekker S, Nyce J. Cognitive engineering and the moral theology and witchcraft of cause. 2011.
