“Panic on the streets of London! Panic on the streets of Birmingham!”
So says the song of the same title by UK band The Smiths, anyway. The true story that follows (yes, I’ve caved in to popular demand and presented it as one of my terrible drawings) demonstrates how well-meaning but ultimately poorly executed use of performance data can lead to widespread panic and counterproductive knee-jerk reactions.
Starting at the top left, work your way through this little beauty…
What happened there then?
Well, the helpful analysts (who meant well) managed to initiate a sequence of increasingly panicked events because they aggregated a bunch of data together at a high level and presented it in a format that makes it impossible to interpret meaningfully. Despite the opaque nature of what was churned out, no one questioned it, and senior managers then drew conclusions based on assumptions about stuff that wasn’t actually there.
Next, the assessment that a particular type of crime increased during one month last year (compared to the previous month – a binary comparison – Nooooo!!!) was disseminated with great urgency to divisional management teams. Faced with such apparently unequivocal evidence that something would probably go wrong again this year, panic set in and we saw more knee-jerking than a row of Can-Can dancers.
A predictable battery of tactics – written plans, resources shifted about at the drop of a hat, more meetings, etc. – was unleashed, until one bright spark, quite far down the organisational food chain, decided to look at the data in context using control charts. And you know what..?
There were NO signals.
Panic over everyone!
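For anyone wondering what ‘looking at the data in context’ involves in practice, here’s a minimal sketch of the kind of check the bright spark might have run. An XmR (individuals) control chart compares each monthly count against limits derived from the average moving range, and only points beyond those limits count as signals. The monthly counts below are invented purely for illustration:

```python
# Minimal XmR (individuals) control chart check.
# The monthly crime counts here are made up for illustration.
counts = [42, 38, 45, 51, 40, 47, 39, 44, 55, 41, 46, 43]

mean = sum(counts) / len(counts)

# Average moving range between consecutive months.
moving_ranges = [abs(b - a) for a, b in zip(counts, counts[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Standard XmR natural process limits: mean +/- 2.66 x average moving range.
upper = mean + 2.66 * avg_mr
lower = mean - 2.66 * avg_mr

# A 'signal' is a point outside the limits - i.e. likely special cause.
signals = [c for c in counts if c > upper or c < lower]
print(f"Limits: {lower:.1f} to {upper:.1f}")
print("Signals:", signals or "none - just routine variation, so don't panic")
```

With stable made-up data like this, every point falls inside the limits: no signals, so nothing special to chase.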
So… points to take away from this story:
- Good intentions are not enough.
- Good people naturally want to do something to tackle perceived problems, but sometimes it’s best not to react.
- Only ever use data in context to establish an evidence base for action.
- Don’t panic!
As Deming used to like saying, “In God we trust. All others bring data”.
Oh, and the ‘bright spark’ I referred to wasn’t me by the way.
Notes (Added 16th April 2013)
Now, I am not going to make a habit of qualifying my ramblings with disclaimers that are longer than the actual posts, but just to ensure that no one gets the wrong impression about what I’m saying in this one, or why, here are a couple of things I want people to be clear about:
1. The post is about the unintended consequences that can occur when managers draw erroneous conclusions about data. It is definitely not a pop at analysts. I know of many analysts who routinely use control charts, as well as others who would relish the opportunity to use them more often.
2. Apparently, there was also “…panic on the streets of Carlisle, Dublin, Dundee, Humberside.” This is not to suggest that the story in this post necessarily emanated from London, Birmingham, Carlisle, Dublin, Dundee, or Humberside. They’re just lyrics from an 80s pop song.
Not sure about that, I think I can see some meaningful cyclic patterns in that last ‘random’ long-term graph… Better panic again…
Good spot Joe. Entirely down to my thoughtless freehand drawing! Having said that, the human brain does like to look for patterns in things, and although the peaks and troughs do appear to be cyclical, there are no recognised signals amidst the data. That’s my excuse anyway.. 😉
Very good… I like the pictorial representation.
I have to disagree on one point though – the “lowly bright spark” probably thought that the original conclusion was flawed in the first instance and probably advised against any action way back then. But was ignored. And continued to be ignored throughout the development of action plans, crisis meetings, resource re-allocation activity, etc. until the management team were so far down this route that to admit they didn’t need to do any of it would have been an admission of failure… so eventually the grandly named Operation Panic was left to die quietly rather than provide any real learning.
A timely reminder of the police’s tendency to over-interpret data.
I think this comes mostly from pressure from above, where strategic trends can look compelling, crime looks predictable and there is an understandable desire to get most ‘bang for the buck’ by preventing/detecting repeat offences.
The reality is that strategic trends are the aggregation of local incidents which, for short-term tactical purposes, are little better than random. Prof Spiegelhalter of Cambs summarised the situation as ‘the overall pattern of events can often be predicted surprisingly well, but not the detail…’ The vast majority of the time, hotspots are at best lukewarm and so-called ‘red’ routes are akin to being white with a delicate shade of pink.
This invariably leaves the patrol sgt/Insp in a situation where their reactive resources are under great pressure, whilst any proactive capacity is unlikely to catch the burglar or the car thief at work (though they probably make themselves useful in all sorts of other ways). Therefore, the temptation to switch them to reactive work has a strong basis in logic.
The inconvenient truth is that for tactical purposes, crime is far less highly patterned than everyone would like to think. The net result is that all sorts of activity is generated for the wrong reasons.
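That near-randomness is easy to demonstrate. Here’s a quick sketch with invented numbers (Gaussian noise as a rough stand-in for Poisson-style variation): even when the underlying rate never changes, roughly half of all binary month-on-month comparisons will show an ‘increase’ of the sort that sets off the panic described in the post.

```python
import random

random.seed(42)  # reproducible noise

# 24 months of incident counts for a division where nothing real is
# changing: a constant underlying rate plus natural month-to-month noise
# (Gaussian used here as a rough stand-in for Poisson variation).
mean_rate, sd = 40, 6
months = [max(0, round(random.gauss(mean_rate, sd))) for _ in range(24)]

# A 'binary comparison' flags any month higher than the one before it.
increases = sum(b > a for a, b in zip(months, months[1:]))
print(f"{increases} of {len(months) - 1} month-on-month comparisons "
      "show an 'increase', with no real change at all")
```

None of those ‘increases’ means anything, which is exactly why context (limits, not last month’s number) matters.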
My concern is that the HMIC’s recent speech about crime prevention and ‘bang for buck’ indicates that he too is about to join in the self-deception and jump into this elephant trap.
On the other hand, perhaps he will read this blog…