A few days ago I posted a video that, in less than 4 minutes, rebutted 10 common climate myths. The most amusing was the emphatic No to the claim that it’s because the Sun is getting hotter. To a certain extent, this is all that one need say with respect to this claim. It really can’t just be the Sun. Since then, however, I’ve encountered David Stockwell’s work suggesting that it is the Sun because the rise in surface temperature correlates well with the cumulative Total Solar Irradiance (TSI).
Now, I haven’t actually read David’s papers, so can’t really comment on what he’s proposed. You may think it somewhat unreasonable to criticise someone’s work without actually reading it. There is, however, a good reason for this: it really can’t be the Sun. Why is this? A very obvious reason relates to climate sensitivity. The surface temperature has increased by 0.85°C since pre-industrial times, and there is still a 0.7 W m⁻² radiative imbalance. If you’re suggesting that this is a response to a change in solar forcing, then you’re suggesting that the change in solar forcing since pre-industrial times has resulted in an increase in equilibrium surface temperature of around 1°C.
The problem, though, is that the solar forcing today is only about 0.1 W m⁻² greater than it was in pre-industrial times, while an increase in surface temperature of 1°C produces an increase in surface flux of around 5.4 W m⁻². That would imply an amplification factor of around 50. There is no evidence to suggest that our climate is that sensitive to changes in solar forcing. Furthermore, if it were that sensitive to changes in solar forcing, it would also be very sensitive to changes in other external forcings, such as anthropogenic ones. The climate can’t be incredibly sensitive to changes in solar forcing while being almost insensitive to changes in other forcings. It really doesn’t make sense.
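As a rough check of these numbers, here is a minimal sketch (my own illustration in Python, not anything from David’s papers) of the Stefan-Boltzmann arithmetic behind the 5.4 W m⁻² figure and the implied amplification factor; the 288 K surface temperature is an assumed, roughly pre-industrial, value:

```python
# Extra outgoing surface flux from a 1 K warming, via the Stefan-Boltzmann law.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

T = 288.0  # assumed pre-industrial-ish surface temperature, K
dF = SIGMA * ((T + 1.0) ** 4 - T ** 4)
print(f"Flux increase for +1 K: {dF:.1f} W m^-2")  # ~5.4 W m^-2

# Compare with the ~0.1 W m^-2 change in solar forcing since pre-industrial times
amplification = dF / 0.1
print(f"Implied amplification factor: ~{amplification:.0f}")
```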
One of the reasons I wanted to write this post, though, was that there was something I’d been meaning to work out. An alternative to the system being very sensitive to changes in solar forcing would be that, for some unknown reason, the surface temperature in the mid-1800s was 1°C below the actual equilibrium value, and since then we’ve simply been recovering towards equilibrium. If the equilibrium temperature is To and the current surface temperature is T, then the amount of energy, dE, that the system will accrue in a time dt is

dE = σ (To⁴ − T⁴) A dt,

where σ is the Stefan-Boltzmann constant and A is the surface area of the Earth.
However, as the Earth accrues energy, the surface temperature, T, will rise, so we need some estimate of how it changes with energy. The land and atmosphere have a mass of around 10¹⁹ kg and a specific heat capacity of about 1000 J kg⁻¹ K⁻¹. This means it would take roughly 10²² J of energy to increase the temperature of the land and atmosphere by 1°C. The oceans, however, have a heat capacity about 100 times greater than that of the land and atmosphere, so increasing their temperature by 1°C would require around 10²⁴ J. As a check of this, the ocean heat content has increased by about 2.5 × 10²³ J since 1960, while surface temperatures have risen by about 0.6°C. That would imply roughly 4 × 10²³ J per degree, but 10²⁴ J per degree is close enough for what I’m trying to illustrate here. So, the other condition we need is

dT = dE / C,

where C ≈ 10²⁴ J K⁻¹ is the effective heat capacity estimated above.
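These back-of-envelope heat capacities are easy to reproduce; here is a short sketch using the rough masses and specific heat quoted above:

```python
# Back-of-envelope heat capacities, using the rough values quoted in the text.
mass_land_atm = 1e19  # kg, combined mass of land surface layer and atmosphere
c_p = 1000.0          # J kg^-1 K^-1, representative specific heat capacity

C_land_atm = mass_land_atm * c_p  # ~1e22 J per degree
C_ocean = 100.0 * C_land_atm      # oceans ~100x larger -> ~1e24 J per degree

# Consistency check against observed ocean heat uptake since 1960
ohc_rise = 2.5e23  # J
dT_obs = 0.6       # K
implied_C = ohc_rise / dT_obs  # ~4e23 J per degree
print(C_land_atm, C_ocean, implied_C)
```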
So, I wrote a short computer code to work out how the temperature would change with time if initially T = 288 K and To = 289 K, with the energy changing according to the first equation above and the temperature according to the second. The result is shown in the figure below. Basically, if we were simply recovering towards equilibrium from being 1°C below it, it would take only about 20 years for temperatures to rise by 0.85°C, and within 60 years we’d be virtually at equilibrium. The profile is also clearly not linear. The dashed line simply shows what would happen if we happened to end up 1°C above equilibrium. Bear in mind that my estimate for the relationship between energy and temperature is possibly on the high side, so the recovery may actually be faster. Also, this is quite different to what would happen if a pulse of energy (from an ENSO event, for example) were to suddenly heat the land and atmosphere; that would decay very quickly – probably within a few months.
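For anyone who wants to reproduce this, a minimal sketch of the calculation might look like the following; it uses simple Euler time-stepping with a one-day step, and the step size and exact constants are my own assumptions rather than whatever the original code used:

```python
# Relaxation towards equilibrium: dE = sigma*(To^4 - T^4)*A*dt, then dT = dE/C.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
A = 5.1e14       # surface area of the Earth, m^2
C = 1e24         # effective heat capacity, J K^-1 (estimate from above)

T, To = 288.0, 289.0  # initial and equilibrium temperatures, K
dt = 86400.0          # one-day time step, s

t = 0.0
while To - T > 0.15:  # run until the surface has warmed by 0.85 K
    dE = SIGMA * (To ** 4 - T ** 4) * A * dt  # energy accrued this step, J
    T += dE / C                               # corresponding temperature rise
    t += dt

years = t / 3.156e7  # seconds per year
print(f"0.85 K of warming after ~{years:.0f} years")
```

Since the relaxation time scales with C, a smaller effective heat capacity shortens it, which is why an over-estimated C means the real recovery would be even faster.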
This post was partly just a bit of fun. I had really just wanted to check how long it would take for the temperature to rise if it happened to start 1°C below equilibrium. The answer: far faster than the time it has actually taken for surface temperatures to rise by 0.85°C, and with a temporal profile very different to what’s been observed. Ultimately, however, the calculation wasn’t really necessary. We know it can’t be the Sun because that would imply the climate is much more sensitive to changes in external forcings than other evidence suggests, and this would apply both to solar and to anthropogenic forcings. You can’t suggest that the climate is much more sensitive to solar forcings than expected while, at the same time, being much less sensitive to anthropogenic forcings.