Anthony Watts has a new post at Watts Up With That (WUWT) called "an illustration that CO2 won't roast the Earth in a runaway tipping point". He shows a figure which illustrates atmospheric CO2 concentrations in previous geological periods. The argument seems to be that, because CO2 has been much higher than it is now, the Earth can't reach some kind of tipping point where it will undergo a runaway greenhouse process.
Okay, so Anthony may actually have a point. A runaway process is unlikely. But this is really a strawman argument. No one is really suggesting that the reason we need to act is to prevent some kind of runaway process. What people are suggesting is that if we continue to add CO2 to our atmosphere at the current rate, we will double atmospheric CO2 concentrations, relative to pre-industrial levels, in about 80 years' time (maybe sooner if the rate increases). There is strong evidence to suggest that the Equilibrium Climate Sensitivity is close to 3°C. So, eventually (within the next few hundred years) the Earth will be 3°C warmer than it was in pre-industrial times. This is hotter than it's been for all of human history and will almost certainly lead to damaging climate change. Of course, if we continue to add CO2 so that concentrations more than double, the warming will be even greater.
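As a quick sanity check on that "about 80 years" figure, here is a minimal sketch. The round numbers (pre-industrial CO2 of ~280 ppm, a current level of ~400 ppm, and a growth rate of ~2 ppm per year) are my assumptions, not figures from the post:

```python
# Rough check of the "doubling in about 80 years" claim.
# Assumed round numbers: pre-industrial CO2 ~280 ppm, current level ~400 ppm,
# and a current growth rate of ~2 ppm/year (doubling comes sooner if it rises).

preindustrial = 280.0   # ppm
current = 400.0         # ppm
growth_rate = 2.0       # ppm per year, assumed constant

doubled = 2 * preindustrial                       # 560 ppm
years_to_double = (doubled - current) / growth_rate

print(f"Doubled concentration: {doubled:.0f} ppm")
print(f"Years until doubling at the current rate: {years_to_double:.0f}")
```

With these assumptions the answer comes out at 80 years, consistent with the timescale in the text; a rising emission rate would shorten it.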
So, the figure that Anthony should probably have shown is the one below. This shows atmospheric CO2 concentrations and temperature for the past 500 million years. What's clear is that when CO2 levels were much higher than they are today, the surface temperature was also much higher than it is today (25°C rather than 15°C). Furthermore, hundreds of millions of years ago, solar insolation was about 6% lower than it is today. This means that the equilibrium temperature would have been about 4°C lower than it is today (there would probably be an even bigger difference in surface temperature, given that this also changes the temperature gradient in the atmosphere).
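The "about 4°C" figure can be checked with the standard equilibrium-temperature (Stefan-Boltzmann) calculation. The sketch below assumes a modern solar constant of 1361 W/m² and a planetary albedo of 0.3, both standard textbook values rather than numbers from the post:

```python
# Check of the insolation claim: with the Sun ~6% dimmer, how much lower
# is the Earth's equilibrium (effective radiating) temperature?
# Assumed values: solar constant S = 1361 W/m^2, planetary albedo = 0.3.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temperature(solar_constant, albedo=0.3):
    """Effective radiating temperature (K) of a planet absorbing
    S(1 - albedo)/4 on average and emitting as a black body."""
    absorbed = solar_constant * (1 - albedo) / 4
    return (absorbed / SIGMA) ** 0.25

t_now = equilibrium_temperature(1361.0)
t_then = equilibrium_temperature(1361.0 * 0.94)  # Sun ~6% dimmer

print(f"Equilibrium temperature today: {t_now:.1f} K")
print(f"With a 6% dimmer Sun:          {t_then:.1f} K")
print(f"Difference: {t_now - t_then:.1f} K")
```

This gives an equilibrium temperature of about 255 K today and a difference of roughly 4 K for a 6% dimmer Sun, matching the figure in the text. Note this is the effective radiating temperature, not the surface temperature, which is why the surface difference could be larger.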
So, recent pre-industrial CO2 concentrations were about 300 ppm. To get to levels seen hundreds of millions of years ago (say 5000 ppm) would require about 4 doublings. Given that solar insolation was then about 6% lower than today, CO2 has to provide (through forcings and feedbacks) about 15 degrees of warming compared to what it provides today: the 10°C higher surface temperature, plus roughly 5°C to offset the fainter Sun. Hence, about 3–4°C of warming per doubling. I appreciate that this is a simplistic calculation that ignores many subtleties and details, but (unless I've made some kind of silly mistake) the data presented by Anthony in his post is essentially entirely consistent with current estimates that climate sensitivity is about 3°C per doubling of atmospheric CO2.
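The calculation above can be sketched in a few lines. All the inputs come from the post itself (300 ppm pre-industrial, 5000 ppm in the deep past, 25°C versus 15°C surface temperatures), except the ~5°C insolation offset, which I take from the equilibrium-temperature argument above:

```python
from math import log2

# Sketch of the simplistic sensitivity estimate in the text.
# Inputs from the post: pre-industrial CO2 ~300 ppm, deep-past CO2 ~5000 ppm,
# surface temperature 25 C then vs 15 C now. The ~5 C insolation offset
# (warming CO2 must supply to compensate the ~6% fainter Sun) is assumed.

doublings = log2(5000 / 300)           # number of CO2 doublings, ~4
surface_difference = 25 - 15           # C, how much warmer the deep past was
insolation_offset = 5.0                # C, offsetting the dimmer Sun (assumed)
total_co2_warming = surface_difference + insolation_offset  # ~15 C

sensitivity = total_co2_warming / doublings
print(f"Doublings: {doublings:.2f}")
print(f"Implied sensitivity: {sensitivity:.1f} C per doubling")
```

The doubling count comes out just over 4 and the implied sensitivity lands between 3 and 4°C per doubling, as stated in the text.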
So, yes, atmospheric CO2 in the past has been much higher than it is today. And yes, it's unlikely that we are risking some kind of runaway process. However, what this past history tells us is that CO2 concentrations play a crucial role in setting surface temperatures, and the same data tells us that climate sensitivity is probably around 3°C per doubling. This means that if we continue as we are, we will probably have locked in 3 degrees of warming by the mid- to late-21st century. So, if Anthony could just extend his analysis a little, he could make a positive contribution to the debate. We could all agree that we're heading for at least 3 degrees of warming (if we do nothing) and could then focus on the discussion we should be having, which is: should we do something about this and, if so, what?