A few things happened yesterday. Firstly I realised I hadn’t written a blog for a while. (This isn’t because I’ve run out of things to say, just because I’ve been busier than usual recently). The second thing was that I discovered my old Lego police station set, retrieved from my Mum’s loft following her house move. The police station has nothing to do with this blog, but I am so impressed with it that I feel obliged to post a picture here. Its discovery also prompted much merriment on Twitter, which made me smile.
The third thing was that, whilst absent-mindedly flicking through last night’s television, I came across a ‘reality’ programme about a major British company (which I won’t name for legal reasons). The scene featured an angry manager holding a performance meeting with a room full of his staff, pointing at charts and data comparing this month against last month, and shouting “Worse!”, “Worse!”, “Worse!” as he reviewed each category. The staff members sat silently; one or two squirmed.
The combination of witnessing this scene and the fact that I haven’t posted anything here for a while was the catalyst for this blog.
Anyone who is familiar with my previous posts will be aware of my views on performance management and systems. I have railed against organisational waste, ranted about the perverse incentives and behaviours generated by numerical targets, and tried to explain a little bit about variation and the intelligent application of Statistical Process Control (SPC) charts. I’m going to try not to cover old ground, but just wanted to reflect on the particular folly of the manager’s approach that I witnessed last night.
To begin with, he was having a go at the wrong people. Many of you will know that Deming attributes about 94% of performance to the system. The workers operate within the constraints that the system imposes on them, and it is management’s responsibility to improve the system. Only management can work on the causes of failure; simply shouting at the workers or exhorting them to work harder will not change anything about the capabilities of the system. A further irony is that this particular manager had based his assumptions of poor performance on incomplete information, so may well have been haranguing his audience during times of outstanding performance.
Putting aside the rights and wrongs of the management style on display, the point I’m really making is about the approach of comparing this month to last month (or to this time last year) – it’s meaningless. A binary comparison between two numbers can never tell the whole picture. If managers could understand just this one point, it would prevent unnecessary berating of the workers, knee-jerk reactions, and other waste activity. It would also enrich organisational learning and foster an enhanced understanding of performance. What manager wouldn’t want that?
To illustrate this point, I would like you to put yourself in the position of the manager. You want to know about how your organisation is performing. This will enable you to make systemic adjustments, identify any genuine poor performance, and predict where performance is heading. I will present you with three charts to assist you. To keep it simple I’ve even taken out the control limits (see my other blog on SPC charts for more information about these). The data are unimportant – the charts could represent sales, crime rates, customer satisfaction ratings etc.
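(If you ever want to knock up charts like these yourself, here is a minimal sketch in Python using matplotlib. It jumps ahead and draws all three views at once, and the monthly figures are entirely invented for illustration – they are not the data behind the charts below.)

```python
import matplotlib.pyplot as plt

# Thirteen months of invented figures, purely for illustration.
months = ["Jul", "Aug", "Sep", "Oct", "Nov", "Dec",
          "Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul"]
values = [65, 47, 50, 49, 53, 48, 51, 46, 50, 52, 48, 49, 47]

fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(12, 3))

# View 1: this month vs last month -- two bars, no context.
ax1.bar(["Last month", "This month"], values[-2:])
ax1.set_title("This month vs last month")

# View 2: this month vs the same month last year -- still just two bars.
ax2.bar(["This time last year", "This month"], [values[0], values[-1]])
ax2.set_title("vs this time last year")

# View 3: the full series -- the only view that shows variation over time.
ax3.plot(months, values, marker="o")
ax3.axhline(sum(values) / len(values), linestyle="--")  # simple mean line
ax3.set_title("All thirteen months")
ax3.tick_params(axis="x", rotation=45)

plt.tight_layout()
plt.show()
```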
The first chart shows performance compared to last month:
Useful?
Well, no, it isn’t really, is it? It doesn’t tell you anything about performance over the last year, or provide information that may assist in predicting future performance. It doesn’t tell you whether performance is improving, getting worse, or whether the system is stable. And of course, last month’s data point is itself subject to variation, so why would anyone think it meaningful to use it as a comparison? If it feels like something is missing, that’s because it is. Why, then, would some managers choose to rely on this method?
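It is worth putting a rough number on this. The little sketch below simulates a deliberately stable system (just random variation around a fixed mean – the figures are made up) and counts how often a naive month-on-month comparison would shout “Worse!” even though nothing about the system has changed.

```python
import random

random.seed(42)

# Simulate a perfectly stable system: monthly figures are nothing but
# common-cause variation around a constant mean.
mean, spread, n_months = 50, 5, 1200
series = [random.gauss(mean, spread) for _ in range(n_months)]

# Count how often a naive "this month vs last month" check reads "Worse!"
worse = sum(1 for prev, cur in zip(series, series[1:]) if cur < prev)
print(f"'Worse!' in {worse} of {n_months - 1} month-on-month comparisons "
      f"({worse / (n_months - 1):.0%}), despite a completely stable system.")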
The good news is that when simplistic binary comparisons are presented in this format, the gaps become blindingly obvious. Even if you stop reading this now, you’ve still attained a more advanced level of understanding than many out there. Use it to your advantage!
Next chart then: this month compared to this time last year. Surely, a comparison against this time last year is much more meaningful? That would take seasonality into account and provide a long-term picture, wouldn’t it? Let’s see…
What does this chart tell you? Not a lot. What’s happened during the intervening period? No one knows. Performance may have been on a steady decline, or this month’s figure may be abnormally high or low. The same applies to last year’s figure. Both are subject to natural variation. In the mind of our manager from the television programme, this binary comparison would represent a terrible decline in performance. Even the slight difference between the two amounts on the ‘This Month vs Last Month’ chart would be taken as a failure. But is it, and can this information be used to understand, predict or improve performance for the organisation? Of course not!
Finally, let’s look at a chart that includes all the data:
At last – the full picture. The chart demonstrates that the system is stable. This time last year the data point was unusually high, which suggests that something may have happened that affected performance. It certainly means that it is misleading to rely upon that point as a comparison with other months. Furthermore, there is no justification for judging this month’s performance against last month’s (and interpreting the slight change as deterioration), as it is clear that long-term levels are stable. If management want to improve performance, then action must be taken on the system; otherwise it is highly likely that performance will be maintained at current levels well into the future. It would be impossible to predict this using either of the first two charts.
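For readers who want to see where a judgement like “the system is stable” comes from, here is a rough sketch of the calculation behind an XmR (individuals and moving range) chart – the kind of SPC chart I usually have in mind. The figures are invented purely to mimic the shape described above (one unusually high point a year ago), and 2.66 is the standard constant applied to the average moving range.

```python
# A minimal XmR-style check on an invented series of monthly figures.
# The first value is deliberately high, to mimic last year's unusual month.
values = [65, 47, 50, 49, 53, 48, 51, 46, 50, 52, 48, 49, 47]

mean = sum(values) / len(values)

# Average moving range between successive points.
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Natural process limits for an individuals chart.
upper = mean + 2.66 * avg_mr
lower = mean - 2.66 * avg_mr

print(f"Mean {mean:.1f}, natural process limits {lower:.1f} to {upper:.1f}")

# A point outside the limits is a signal worth investigating; everything
# inside is routine variation, however it happens to compare to last month.
for i, v in enumerate(values, start=1):
    if v > upper or v < lower:
        print(f"Month {i}: {v} is outside the limits -- investigate")
```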
To summarise then…
‘This week/month/year’ vs ‘last week/month/year’ comparisons are commonplace in both the public and private sectors, and many managers still rely on them to judge performance. They appear to offer the comfort of a simple, clear comparison that shows which direction performance is heading, but, as I hope I have demonstrated, they are misleading and can only generate harmful reactions. Binary comparisons never tell the whole truth: the only possible interpretations are ‘we are doing fine’, which invites complacency when the long-term picture may indicate that urgent action on the system is required, or ‘things are getting worse’ (even when they aren’t), which risks unnecessary organisational responses to fix the ‘problem’.
When you compare the charts above, I hope you can see how foolish that manager was to berate his staff based on his company’s ‘this month vs last month’ performance data. If you are a manager, please try looking at the fuller picture of performance instead of ‘this week vs last week’, ‘this month vs last month’ or ‘compared to this time last year’. It will improve your organisation’s understanding of performance, and better still, actual performance.
A very interesting and entertaining read. We sing from the same hymn sheet (so to speak). My thesis focuses on how quantitative performance indicators prevent the introduction of quality service in policing – e.g. Savage and Leishman (1996), who state that the fundamental pursuit of quantitative performance measures may detract from the service dimension of policing and encourage an over-emphasis on crime fighting. I would be interested in your reference to Deming. Do you have the source? Great read – thanks.
Hi – the 94% reference is from: Deming, W. E. (1986) Out of the Crisis. Cambridge: MIT Press, p. 315. Good book. It gets a bit heavy with indecipherable algebra towards the end, but I’d still recommend it!
So simple, yet so misunderstood. My take is that many managers adopt the more simplistic (wrong) methods simply to provide ‘evidence’ of action as a ‘manager’ – often action that is neither required nor effective in the long term, save for hacking off those already working flat out. And, as you say, more often than not it is the system which is broken, not the output. Failed brewery trips come to mind!
Brilliant! But dear GOD do managers not like these things. They call these, and approaches like them, academic and theoretical – for rocket scientists and nerds. Amazingly simple to use, yet… I think it doesn’t appeal to the angry, idiotic manager you describe in the TV programme because it has nothing to tell him that he doesn’t (think he) already knows. It’s gone down! It’s gone up! Stayed the same? Well, that’s a differentiation too far. The tough-minded, horny-handed man of action needs an excuse for action, not a tool for thinking. Oddly, they never realise that management is not a physical job. It isn’t digging ditches, so if you aren’t paid for brawn it must be brain, yet brain is the last thing to be used. That’s for girly swots.