In case you don’t already know, last Sunday Andrew Neil interviewed Ed Davey on the Sunday Politics show. Andrew has come under quite a lot of criticism for using what many feel are typical skeptic (denier?) arguments when conducting this interview. In particular, he showed the following graphic, arguing that it shows that even though CO2 concentrations have continued to rise, temperatures – since about 2000 – have not, and may actually have fallen.
The first thing to say is that even if the above graphic is a correct representation of the changes in CO2 and temperature, it may still not be particularly relevant. The temperature is influenced both by a long-term rising trend, due to increasing atmospheric CO2 concentrations, and by natural variations that can quite easily mask this rising trend over periods of a decade or so. However, one issue with the above graphic is that no one is quite sure what data was used to produce the temperature line. It’s clearly been smoothed, but it’s not clear how this smoothing was done.
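To illustrate how natural variations can hide a trend, here’s a minimal sketch in Python. It uses entirely made-up numbers – a steady 0.1°C per decade trend plus random year-to-year variability of an assumed, roughly plausible size (the 0.1°C standard deviation is my choice for illustration, not a claim about actual climate variability) – and shows that individual decade-long trends can easily come out flat or negative.

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1970, 2014)
trend = 0.01 * (years - 1970)                  # steady 0.1 degC per decade
noise = rng.normal(0.0, 0.1, size=years.size)  # assumed "natural variability", ~0.1 degC
temps = trend + noise

# Fit a linear trend to every 10-year window and report the spread of slopes.
slopes = [np.polyfit(years[i:i + 10], temps[i:i + 10], 1)[0] * 10
          for i in range(years.size - 10)]
print(f"Decadal trends range from {min(slopes):.2f} to {max(slopes):.2f} degC/decade")
```

Even with the underlying trend fixed at 0.1°C per decade, the fitted decadal slopes scatter widely around it, which is the sense in which short periods can mask the long-term rise.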
Andrew Neil claims that this is data from the Met Office, and seems to think that this alone justifies his interpretation of the data. The only Met Office data that I’m aware of that looks similar is their decadally smoothed data. The value for each year is produced by averaging the data over a 21-year period (the 10 years before the year of interest, the year of interest itself, and the 10 years after it). For example, the value for 1980 is produced by averaging the values from 1970 to 1990. The problem, though, is that once you get past 2003 there isn’t 10 years’ worth of data after the year of interest. What the Met Office does is simply use whatever data is available up to 2013. This means that as you get closer to 2013 the window becomes increasingly one-sided, and the average is biased towards the earlier part of the window: the value for 2013, for instance, is an average over 2003 to 2013, effectively centred on 2008. So even if there were a continually rising trend, the decadally smoothed values for the period 2003 to 2013 would appear to show a slowdown.
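For concreteness, here’s a minimal sketch of the kind of truncated centred average described above (my own construction – the Met Office’s actual implementation may differ in detail):

```python
import numpy as np

def decadal_smooth(values, half_window=10):
    """Centred (2*half_window + 1)-point running mean, with the window
    simply truncated where it would run past the ends of the data."""
    values = np.asarray(values, dtype=float)
    smoothed = np.empty_like(values)
    for i in range(values.size):
        lo = max(0, i - half_window)                # clipped at the start of the data
        hi = min(values.size, i + half_window + 1)  # clipped at the end of the data
        smoothed[i] = values[lo:hi].mean()
    return smoothed
```

The key behaviour is in the `max` and `min` clipping: for a year like 2010 the window runs only from 2000 to 2013, so the mean is effectively centred on about 2006 rather than on 2010.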
The figure below illustrates exactly this. I’ve produced fake temperature anomaly data for the period 1970 to 2013, assuming a trend of 0.1°C per decade. My fake data therefore continues to rise right up until 2013; this is shown by the solid line. I then smooth this using the 21-year smoothing used by the Met Office, but – as they do too – truncating the window at 2013, as I have no data past this date. This is the dashed line in the figure. Even though the raw data rises with the same trend throughout, the decadally smoothed data shows a slowdown after 2003, simply due to the averaging procedure.
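Here’s a sketch of how one might reproduce that figure – my own reconstruction from the description above, not the exact code behind it:

```python
import numpy as np
import matplotlib.pyplot as plt

years = np.arange(1970, 2014)    # 1970 to 2013 inclusive
anomaly = 0.01 * (years - 1970)  # perfectly linear, 0.1 degC per decade

# 21-year centred mean, truncated at the ends of the record
smoothed = np.empty_like(anomaly, dtype=float)
for i in range(years.size):
    lo = max(0, i - 10)
    hi = min(years.size, i + 11)
    smoothed[i] = anomaly[lo:hi].mean()

plt.plot(years, anomaly, '-', label='fake data (0.1 degC/decade trend)')
plt.plot(years, smoothed, '--', label='21-year truncated mean')
plt.xlabel('Year')
plt.ylabel('Temperature anomaly (degC)')
plt.legend()
plt.show()
```

In the interior of the record the dashed line sits exactly on the solid one (a symmetric average of a linear trend reproduces the trend), but after about 2003 it flattens, purely because the window has become one-sided.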
Now, I should make it clear that I’m not claiming that the slowdown during the 2000s is simply due to the averaging procedure. It’s clear from monthly or annually averaged data that there has been a slowdown over the last decade or so. The point I’m trying to make is that if Andrew Neil has used the Met Office’s decadally smoothed data, then the averaging procedure will produce, or enhance, a slowdown in the last decade simply because there is no data beyond 2013. The procedure will over-emphasise any divergence between the temperature anomaly and CO2 concentrations, and so should be interpreted carefully.
I get the impression that Andrew thinks the only thing that matters is that the data comes from the Met Office. That alone is not enough. If you’re going to use data from some source, you also need to understand how the data was produced, and you should put some effort into ensuring that your interpretation of it is appropriate. You can’t simply plot some data on a graph and then pontificate about what it means. Science is a little more complicated than that.