Why ‘Year-To-Date’ is Rubbish

‘Year-to-date’ figures are often used in performance frameworks, both in the public and private sectors. In policing, ‘year-to-date’ figures are regularly used to track the number of reported crimes at any given point in the year, supposedly as an indicator of whether the police are doing a good job or not. Now, leaving aside the debate about whether crime rates should be used as a performance indicator at all (I might talk about that another time), let’s have a look at why the practice of ‘year-to-date’ itself is rubbish.

Stick Person Goes For a Run

Imagine a runner…

[Image: stick person running]

Here we are – it’s our little friend, the stick person. Let’s say the stick person likes to run 12 miles at a time, and his personal best is 120 minutes and 1 second; an average of about 10 minutes per mile.

Next time the stick person sets out on a run, he aims to do his very best and see how quickly he can run the 12 miles. He knows that multiple factors will affect his performance, such as his fitness levels, his current weight, his choice of clothing, his diet, and so on.

He also knows that other, external factors will affect his performance; for example, the outside temperature, the wind, the terrain and so on. He might be held up for a few seconds waiting to cross a road. His shoelace might come undone. Something else might happen outside of his direct control that affects his final time.

So, our stick person sets out to maximise his chances of a fast time by ensuring the systems conditions he can influence are favourable. He trains. He doesn’t run in a bulky duffle coat. He avoids drinking 10 pints of Guinness and eating a massive curry the night before his run.

[Image: stick person on beer]

Anyway, once he sets off on his run, the stick person is smart enough to measure progress, because he knows measuring stuff is vital. He wears a heart rate monitor which helps him check if his heart rate is within a normal and safe range. He checks his stopwatch every so often to gauge progress. He processes this information as he runs along, taking into account the context around him, knowing that if his heart rate gets too high, he will have to slow down a little; likewise, he knows that when he’s running along an uphill section of the route, he is likely to cover ground a bit slower.

The stick person uses all this information to ensure he is doing his best at all times. He knows that some miles will be faster than others, but is not unduly concerned because he understands this is normal. It might be that he beats his personal best this time, or it might be that he’s just a few seconds too slow. Either way, his objective is to continually improve.

Now, imagine he adopted the ‘year-to-date’ method to pace himself – a strict 10 minutes per mile. Of course, this ignores all the factors that can influence his speed at any given time. So what happens? Well, he completes the first mile in 9 minutes 55 seconds, giving him 5 seconds ‘in the bank’. Unfortunately, the second mile is partly uphill and it takes him 10 minutes and 20 seconds, causing an overall ‘deficit’ of 15 seconds to that point.

[Image: stick person failing]

Now the pressure is on, so he speeds up a bit, but realising he’s still a bit behind time, he decides to sprint the last couple of hundred yards of the next mile. This makes him feel tired, but at least he makes up some time. This process repeats itself as he focuses on each individual mile, until he collapses at the side of the road, exhausted. Poor stick person.
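The pacing arithmetic can be sketched in a few lines of Python. The first two splits (9:55 and 10:20) come from the story above; the remaining ten are made up for illustration. Watch how the rigid 10-minute plan turns normal variation into a running ‘deficit’ that the stick person feels compelled to sprint away:

```python
# Mile splits in seconds. The first two (595s, 620s) are from the story;
# the rest are hypothetical but plausibly varied.
splits = [595, 620, 600, 610, 590, 605, 615, 595, 600, 610, 590, 600]
target = 600  # the rigid plan: 10 minutes per mile, every mile

banked = 0
for mile, s in enumerate(splits, start=1):
    banked += target - s  # positive = seconds 'in the bank', negative = 'deficit'
    print(f"Mile {mile:2d}: split {s}s, running balance {banked:+d}s")

total = sum(splits)
print(f"Total: {total}s against a plan of {target * len(splits)}s")
```

After mile one the balance reads +5; after mile two it reads −15, exactly the ‘deficit’ that sets the panic off. The final total differs from the plan by only half a minute, which the mile-by-mile alarm bells never revealed.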

Clearly, no self-respecting runner would prefer that method over the stick person’s original approach. But wait! Bizarrely, the equivalent of worrying about individual mile timings (and sudden sprinting) is prevalent in many performance management situations, as we shall now see…

The Trouble With ‘Year-To-Date’

The problems with ‘year-to-date’ are many, especially when today’s figure is compared to:

  • The average.
  • The previous year’s figure (or an aggregation of previous years’ figures).
  • An arbitrary numerical target.

Have a look at the table below –

[Image: year-to-date table]

Here we can see two performance years that ended neck-and-neck. (It doesn’t matter what the numbers relate to.) Firstly, imagine the reaction each month as management compare the ‘year-to-date’ figure with the monthly average required to finish the year ‘on track’ – cue a mix of concern/anger/confusion (when it’s higher), and feelings of success and self-congratulation (when it’s lower). All of this, as you can see, is a big waste of time because, looking at the whole year in retrospect, both rows come in at 480 anyway.

This occurs because of normal variation – the fluctuations amongst the numbers are caused by all those internal and external factors that affect how the system performs, as in the case of the stick person. As you can see, variation even applies to systems or processes that are stable. Therefore, there is no point whatsoever in getting excited about whether a number is a bit higher or lower than the average at any given point in time.
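The effect is easy to reproduce. Here is a minimal Python sketch using hypothetical monthly figures (the original table image is not reproduced here, so these numbers are invented, but like the real table they sum to 480), judging each month against the ‘on-track’ average:

```python
# Hypothetical monthly figures; the year sums to 480, i.e. an average of 40/month.
year_1 = [38, 44, 41, 35, 46, 39, 42, 37, 43, 40, 36, 39]

required = 480 / 12  # the monthly average needed to finish 'on track'
ytd = 0
for month, n in enumerate(year_1, start=1):
    ytd += n
    on_track = month * required
    verdict = "panic!" if ytd > on_track else "self-congratulation"
    print(f"Month {month:2d}: YTD {ytd:3d} vs plan {on_track:5.1f} -> {verdict}")
```

Every month produces a verdict of some kind, yet the verdicts carry no information at all: the fluctuations are just noise around an average the year was always going to hit.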

“About half of everything is below average”, as I like to say…Hahaha! 😉

The misguided belief that some meaning can be ascribed to the types of fluctuations I’ve just talked about leads to exhortations such as, “We cannot afford to record more than 135 crimes per day”, or “Sales must exceed £150,000 per week”, and so on. It causes people to withhold surplus units of whatever’s being measured until the next period. It causes under-recording and other bizarre practices designed to keep the numbers under control. This is where our stick person disregards his knowledge about his surroundings and begins to run flat out.

And that’s just comparing the ‘year-to-date’ figure against averages…that’s bad enough, but check this out – what happens when you compare it against last year’s ‘year-to-date’ figures? I’ll tell you – it gets worse!

[Image: stick person chart]

This is because – guess what – last year’s figures were subject to variation too! The numbers went up and down. Crime, sales figures, unemployment rates, you name it – none of them happened in a nice flat line. We have ZIG ZAGS, people; ZIG ZAGS! However, do not be alarmed – this is just normal variation again. So, when we try and compare this year’s ‘year-to-date’ figure against last year’s, this is even dafter than making a comparison with the average, because we are comparing two moving variables. Cue wider fluctuations and more panic…

And it’s all so meaningless. As you can see from the table, both years came in at 480 anyway. Imagine how quickly our stick person would burn out if he adopted this method of measuring his performance as he runs along.
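Comparing two fluctuating running totals can be sketched the same way. These monthly figures are hypothetical (the original table isn’t reproduced here), but like the real table, both years sum to 480:

```python
# Two hypothetical years of monthly figures; both sum to 480.
last_year = [38, 44, 41, 35, 46, 39, 42, 37, 43, 40, 36, 39]
this_year = [42, 37, 40, 44, 35, 41, 38, 45, 39, 36, 44, 39]

ytd_last = ytd_this = 0
for month in range(12):
    ytd_last += last_year[month]
    ytd_this += this_year[month]
    diff = ytd_this - ytd_last
    print(f"Month {month + 1:2d}: this year {ytd_this:3d} "
          f"vs last year {ytd_last:3d} ({diff:+d})")
```

The month-by-month ‘gap’ swings from +5 to −8 and back – two noisy running totals beating against each other – before landing on exactly zero, because both years finish on 480 anyway.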

Finally, we consider the comparison between the ‘year-to-date’ figure and an arbitrary numerical target – in this case a nice 10% reduction. As you can see from the table, the target was only achieved during two months. This is because someone invented it in their head, without having any understanding of the systems conditions likely to influence performance. It’s just like our stick person suddenly setting himself a target to run 9-minute miles, when he has never run faster than 10-minute miles. Targets do not provide a method for achieving stated aims.

[Image: stick person targets]

Oh, and that would just make our poor stick person collapse at the side of the road even sooner. Poor stick person.


All of these ‘year-to-date’ methods are incapable of telling you anything about performance. FACT.

Furthermore, they are all quite capable of inducing dysfunctional behaviour, as people mistakenly assume there must be a meaning for the apparent differences between the numbers (caused by normal variation), then change tactics to try and get the ‘year-to-date’ figure on the preferred side of whatever number it is being compared against.

‘Year-to-date’ obscures genuine trends when they do exist, causes false signals and mistaken assumptions, makes people ask the wrong questions about the wrong things, causes unfair blame and arbitrary praise, leads to short-termism, knee-jerking and a fixation on today’s isolated number at the expense of understanding what the actual influencing factors are. Oh, and you may have noticed – the whole approach is based on making binary comparisons, which are known to be very rubbish indeed.

So, if you use ‘year-to-date’ in your performance framework, do yourself a favour and ditch it immediately, then go out and do something useful with your data instead.

Take some tips from the stick person!

[Image: stick person thumbs up]

About InspGuilfoyle

I am a serving Police Inspector and systems thinker. I am passionate about doing the right thing in policing. I dislike numerical targets and unnecessary bureaucracy.
This entry was posted in Systems thinking. Bookmark the permalink.

13 Responses to Why ‘Year-To-Date’ is Rubbish

  1. nosapience says:

    Reblogged this on Nosapience's Blog and commented:
    First of the year and on top form as usual.

  2. Dave Hasney says:

    Reblogged this on Dave's Bankside Babble and commented:
    Another fine example pf why current ‘performance’ management is failing in our public services…

  3. Susan Coleman says:

    Obligatory reading for the public sector – and less time worrying, more time doing.

  4. Pingback: Why ‘Year-To-Date’ is Rubbish | Pol...

  5. dictadicit says:

    “About half of everything is below average”
    Actually, this is mathematically untrue, as a simple example will show:

    Let a set S = {1, 1, 1, 1, 1, 1, 1, 1, 1, 21}
    Clearly, the average is (9*1+21)/10 = 3.
    So, 90% of S are below average!

    • Gold star to you my friend! Of course, you’re absolutely right. I should have said “About half of everything is below average where Gaussian distribution applies”. It just didn’t seem as catchy..

      • Alex Ray says:

        Well, strictly speaking, there are three standard types of average… The mean (which is what is generally termed the average – sum and divide by the number of values), the median (which is exactly halfway by definition), and the mode (which is the most frequent). As mentioned above, only for a perfect ‘normal’ distribution will all three be the same. Handy tip – look at the distributions and see. If it isn’t bell shaped, beware! (I will spare you, for now, a journey around Anscombe’s Quartet and the Central Limit Theorem….)

      • But ‘About half of everything is below the mean average where Gaussian distribution applies’ is an even worse catchphrase! Stop ruining my slogans. And no, I don’t want to listen to the music of Anscombe’s quartet, whoever they are…

  6. I love “half of everything is below average” I need to use that and look clever haha!

  7. Paul says:

    The other definition I use is that the average means you are wrong half the time

  8. Pingback: Pigsaw Blog » Blog Archive » Bookmarks for 10 Jan 2014

  9. forelorn says:

    The reality of this is that senior police officers who don’t know or understand complex mathematical theory need to be convinced. Therefore it has to be simple. Having myself been involved in the police performance world and had to convince senior managers to implement a new strategic performance methodology, I don’t underestimate that challenge. If you want to test it for yourself, try finding a senior officer and attempting to convince them of the validity of sampling theory (e.g. sampling a population of 650,000 by carrying out 384 surveys, which gives a 95% confidence level with a 5% confidence interval).

  10. Pingback: Straight Lines | InspGuilfoyle
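[Editor’s note: the two numerical claims raised in the comments above – the skewed set where 90% of values sit below the mean, and the 384-survey sample size for a population of 650,000 – both check out. A short Python sketch using the standard sample-size formula with finite-population correction:]

```python
from statistics import mean, median, mode

# Claim 1: in a skewed set, far more than half can sit below the (mean) average.
S = [1, 1, 1, 1, 1, 1, 1, 1, 1, 21]
m = mean(S)                                   # mean is 3
below = sum(x < m for x in S) / len(S)        # 0.9 -> 90% below the mean
print(f"mean {m}, median {median(S)}, mode {mode(S)}, below mean {below:.0%}")

# Claim 2: 384 surveys give 95% confidence with a 5% margin of error.
z, p, e, N = 1.96, 0.5, 0.05, 650_000         # worst-case proportion p = 0.5
n0 = (z ** 2 * p * (1 - p)) / e ** 2          # infinite-population sample size
n = n0 / (1 + (n0 - 1) / N)                   # finite-population correction
print(f"required sample size: {round(n)}")    # 384
```

Note the median of the skewed set is 1, so ‘about half is below the median’ always holds; it is only the mean that 90% of the set falls below.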
