‘Year-to-date’ figures are often used in performance frameworks, both in the public and private sectors. In policing, ‘year-to-date’ figures are regularly used to track the number of reported crimes at any given point in the year, supposedly as an indicator of whether the police are doing a good job or not. Now, leaving aside the debate about whether crime rates should be used as a performance indicator at all (I might talk about that another time), let’s have a look at why the practice of ‘year-to-date’ itself is rubbish.
Stick Person Goes For a Run
Imagine a runner…
Here we are – it’s our little friend, the stick person. Let’s say the stick person likes to run 12 miles at a time, and his personal best is 120 minutes and 1 second; an average of about 10 minutes per mile.
Next time the stick person sets out on a run, he aims to do his very best and see how quickly he can run the 12 miles. He knows that multiple factors will affect his performance, such as his fitness levels, his current weight, his choice of clothing, his diet, and so on.
He also knows that other, external factors will affect his performance; for example, the outside temperature, the wind, the terrain and so on. He might be held up for a few seconds waiting to cross a road. His shoelace might come undone. Something else might happen outside of his direct control that affects his final time.
So, our stick person sets out to maximise his chances of a fast time by ensuring the systems conditions he can influence are favourable. He trains. He doesn’t run in a bulky duffle coat. He avoids drinking 10 pints of Guinness and eating a massive curry the night before his run.
Anyway, once he sets off on his run, the stick person is smart enough to measure progress, because he knows measuring stuff is vital. He wears a heart rate monitor which helps him check if his heart rate is within a normal and safe range. He checks his stopwatch every so often to gauge progress. He processes this information as he runs along, taking into account the context around him, knowing that if his heart rate gets too high, he will have to slow down a little; likewise, he knows that when he’s running along an uphill section of the route, he is likely to cover ground a bit slower.
The stick person uses all this information to ensure he is doing his best at all times. He knows that some miles will be faster than others, but is not unduly concerned because he understands this is normal. It might be that he beats his personal best this time, or it might be that he’s just a few seconds too slow. Either way, his objective is to continually improve.
Now, imagine he adopted the ‘year-to-date’ method to pace himself - a strict 10 minutes per mile. Of course, this ignores all the factors that can influence his speed at any given time. So what happens? Well, he completes the first mile in 9 minutes 55 seconds, giving him 5 seconds ‘in the bank’. Unfortunately, the second mile is partly uphill and it takes him 10 minutes and 20 seconds, causing an overall ‘deficit’ of 15 seconds to that point.
Now the pressure is on, so he speeds up a bit, but realising he’s still a bit behind time, he decides to sprint the last couple of hundred yards of the next mile. This makes him feel tired, but at least he makes up some time. This process repeats itself as he focuses on each individual mile, until he collapses at the side of the road, exhausted. Poor stick person.
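The ‘in the bank’/deficit arithmetic above can be sketched in a few lines. The mile splits are invented to match the story (9:55, then 10:20 on the uphill mile); a 10-minute mile is 600 seconds:

```python
TARGET = 600  # a strict 10-minutes-per-mile pace, in seconds

# Invented mile splits matching the story: 9:55, then 10:20 uphill.
splits = [595, 620]

balance = 0
for mile, split in enumerate(splits, start=1):
    balance += TARGET - split  # positive = 'in the bank', negative = deficit
    print(f"Mile {mile}: {balance:+d} seconds")  # +5, then -15
```

Five seconds ‘banked’, then a 15-second ‘deficit’ — and the chasing begins.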
Clearly, no self-respecting runner would prefer that method over the stick person’s original approach. But wait! Bizarrely, the equivalent of worrying about individual mile timings (and sudden sprinting) is prevalent in many performance management situations, as we shall now see…
The Trouble With ‘Year-To-Date’
The problems with ‘year-to-date’ are many, especially when today’s figure is compared to:
- The average.
- The previous year’s figure (or an aggregation of previous years’ figures).
- An arbitrary numerical target.
Have a look at the table below -
Here we can see two performance years that ended neck-and-neck (it doesn’t matter what the numbers relate to). Firstly, imagine the reaction each month as management compare the ‘year-to-date’ figure with the monthly average required to finish the year ‘on track’: cue a mix of concern/anger/confusion (when it’s higher), and feelings of success and self-congratulation (when it’s lower). All of this, as you can see, is a big waste of time, because looking at the whole year in retrospect, both rows come in at 480 anyway.
This occurs because of normal variation – the fluctuations amongst the numbers are caused by all those internal and external factors that affect how the system performs, as in the case of the stick person. As you can see, variation even applies to systems or processes that are stable. Therefore, there is no point whatsoever in getting excited about whether a number is a bit higher or lower than the average at any given point in time.
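To make the point concrete, here is a sketch using invented monthly figures (the article’s actual table values aren’t reproduced here) that deliberately sum to 480, giving a required average of 40 per month:

```python
# Twelve invented monthly counts totalling 480 (average 40) - stand-ins
# for one row of the article's table.
year = [38, 43, 36, 45, 37, 44, 39, 42, 35, 46, 37, 38]
assert sum(year) == 480

required = 40  # monthly average needed to finish the year 'on track'

verdicts = []
ytd = 0
for month, count in enumerate(year, start=1):
    ytd += count
    verdicts.append("concern" if ytd > required * month else "celebration")

print(verdicts)  # flip-flops between the two, yet the year lands on 480
```

Every one of those monthly verdicts is a reaction to nothing but routine variation; the cumulative figure was always heading for 480.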
“About half of everything is below average”, as I like to say…Hahaha!
The misguided belief that some meaning can be ascribed to the types of fluctuations I’ve just talked about leads to exhortations such as, “We cannot afford to record more than 135 crimes per day”, or “Sales must exceed £150,000 per week”, and so on. It causes people to withhold surplus units of whatever’s being measured until the next period. It causes under-recording and other bizarre practices designed to keep the numbers under control. This is where our stick person disregards his knowledge about his surroundings and begins to run flat out.
And that’s just comparing the ‘year-to-date’ figure against averages… that’s bad enough, but check this out – what happens when you compare it against last year’s ‘year-to-date’ figures? I’ll tell you – it gets worse!
This is because – guess what – last year’s figures were subject to variation too! The numbers went up and down. Crime, sales figures, unemployment rates, you name it – none of them happened in a nice flat line. We have ZIG ZAGS, people; ZIG ZAGS! However, do not be alarmed – this is just normal variation again. So, when we try to compare this year’s ‘year-to-date’ figure against last year’s, this is even dafter than making a comparison with the average, because we are comparing two moving variables. Cue wider fluctuations and more panic…
And it’s all so meaningless. As you can see from the table, both years came in at 480 anyway. Imagine how quickly our stick person would burn out if he adopted this method of measuring his performance as he runs along.
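The same point in code: two invented years of monthly counts, both deliberately totalling 480, with the year-on-year YTD gap lurching around before ending at exactly zero:

```python
from itertools import accumulate

# Two invented years of monthly counts; both deliberately total 480.
last_year = [38, 43, 36, 45, 37, 44, 39, 42, 35, 46, 37, 38]
this_year = [44, 36, 43, 38, 45, 35, 41, 39, 46, 34, 42, 37]

# Month-by-month gap between the two 'year-to-date' lines.
gaps = [t - l for t, l in zip(accumulate(this_year), accumulate(last_year))]
print(gaps)  # [6, -1, 6, -1, 7, -2, 0, -3, 8, -4, 1, 0]
```

The gap swings from 8 ahead to 4 behind during the year — plenty of opportunity for panic — yet the two years finish identical.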
Finally, we consider the comparison between the ‘year-to-date’ figure and an arbitrary numerical target – in this case a nice 10% reduction. As you can see from the table, the target was only achieved during two months. This is because someone invented it in their head, without having any understanding of the systems conditions likely to influence performance. It’s just like our stick person suddenly setting himself a target to run 9-minute miles, when he has never run faster than 10-minute miles. Targets do not provide a method for achieving stated aims.
Oh, and that would just make our poor stick person collapse at the side of the road even sooner. Poor stick person.
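The target comparison works the same way. A minimal sketch, again using invented figures, and assuming the 10% reduction target is measured as a line at 0.9× last year’s cumulative count each month:

```python
from itertools import accumulate

# Invented monthly counts; last year totalled 480, so a 10% reduction
# target means finishing this year at 432 or fewer.
last_year = [38, 43, 36, 45, 37, 44, 39, 42, 35, 46, 37, 38]
this_year = [44, 36, 43, 38, 45, 35, 41, 39, 46, 34, 42, 37]

on_target = [t <= 0.9 * l
             for t, l in zip(accumulate(this_year), accumulate(last_year))]
print(sum(on_target))  # months 'on target' - 0 with these invented numbers
```

With these invented numbers the target line is never reached at all (the article’s own table shows it hit in just two months); either way, the line itself contains no method for getting there.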
All of these ‘year-to-date’ methods are incapable of telling you anything about performance. FACT.
Furthermore, they are all quite capable of inducing dysfunctional behaviour, as people mistakenly assume there must be meaning behind the apparent differences between the numbers (caused by normal variation), then change tactics to try to get the ‘year-to-date’ figure on the preferred side of whatever number it is being compared against.
‘Year-to-date’ obscures genuine trends when they do exist, causes false signals and mistaken assumptions, makes people ask the wrong questions about the wrong things, and causes unfair blame and arbitrary praise. It leads to short-termism, knee-jerk reactions and a fixation on today’s isolated number, at the expense of understanding what the actual influencing factors are. Oh, and you may have noticed - the whole approach is based on making binary comparisons, which are known to be very rubbish indeed.
So, if you use ‘year-to-date’ in your performance framework, do yourself a favour and ditch it immediately, then go out and do something useful with your data instead.
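One commonly suggested ‘something useful’ (an illustration, not necessarily this author’s prescription) is a process-behaviour chart, which sets natural limits from the data’s own variation. An XmR (individuals) chart puts the limits at the mean plus or minus 2.66 times the average moving range:

```python
# XmR chart limits for invented monthly counts. Points inside the limits
# are routine variation; points outside are signals worth investigating.
# 2.66 is the standard XmR scaling constant.
data = [38, 43, 36, 45, 37, 44, 39, 42, 35, 46, 37, 38]

mean = sum(data) / len(data)
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

lower = mean - 2.66 * avg_mr
upper = mean + 2.66 * avg_mr

signals = [x for x in data if not lower <= x <= upper]
print(f"limits: {lower:.1f} to {upper:.1f}")  # limits: 22.6 to 57.4
print(signals)  # [] - nothing here but normal variation
```

Every month sits comfortably inside the limits, so there is nothing to react to — which is exactly what the monthly hand-wringing over YTD figures fails to recognise.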
Take some tips from the stick person!