Over the past 30 years, they are at least predicting 71% too much heat. Maybe 159%. (see graph)
The figure he’s referring to is below. Here’s problem number one. These are temperature anomalies, and so are relative to an arbitrary baseline (in this case, adjusted to zero over 1979-1982). The percentage difference that Bjorn calculates depends entirely on the choice of baseline. If it were set relative to 1990, the discrepancy would be massive; if relative to 1950, much smaller. It’s an entirely meaningless calculation. The 159% presumably refers to the difference between the models and the satellite data. Well, the model results are presumably for the surface, so why is he comparing them with tropospheric temperatures (I believe, at least, that these are lower-troposphere measurements)? He’s also considering only the ensemble model mean (I believe) and ignoring that climate models are still consistent with recent surface warming at the few-percent level.
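To see why the percentage depends on the baseline, here is a minimal sketch with made-up illustrative numbers (not the actual model or satellite series): the same pair of temperature records yields a different “percent too much warming” depending purely on where the anomaly zero is set.

```python
# Two hypothetical temperature series in absolute terms (degrees C).
# These numbers are invented purely for illustration.
model = [14.0, 14.2, 14.4]
obs = [14.0, 14.1, 14.2]

def anomaly(series, baseline):
    """Convert an absolute temperature series to anomalies about a baseline."""
    return [t - baseline for t in series]

# Baseline zeroed at the first value: model anomaly ends at 0.4, obs at 0.2,
# so the model appears to warm 100% "too much".
ratio_a = anomaly(model, model[0])[-1] / anomaly(obs, obs[0])[-1]

# Shift the baseline down by just 0.1 C: anomalies end at 0.5 and 0.3,
# and the apparent overestimate drops to about 67%.
ratio_b = anomaly(model, model[0] - 0.1)[-1] / anomaly(obs, obs[0] - 0.1)[-1]

print(ratio_a)  # 2.0  -> "100% too much"
print(ratio_b)  # ~1.67 -> "67% too much"
```

Nothing about the underlying warming changed between the two calculations; only the arbitrary zero point did, which is why a percentage comparison of anomalies is not meaningful.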
Additionally, Bjorn says the current climate models are running way too hot. I actually don’t think they are (maybe someone who knows more could confirm). They may currently be over-estimating the level of surface warming, but I think they have been quite good at predicting overall warming. The planet continues, according to ocean heat content data, to accrue energy at a rate of around 10²² J per year. So, yes, surface warming is currently slower than expected, but overall warming is not.
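As a rough sanity check on that figure (using assumed round values for the seconds in a year and the Earth’s surface area), 10²² J per year corresponds to a planetary energy imbalance of roughly 0.6 W/m², consistent with the overall warming continuing:

```python
# Back-of-envelope conversion of ocean heat uptake to a planetary
# energy imbalance. All values are round approximations.
energy_per_year = 1e22       # J/yr, the rate cited in the text
seconds_per_year = 3.15e7    # ~365.25 days in seconds
earth_area = 5.1e14          # m^2, Earth's surface area

imbalance = energy_per_year / seconds_per_year / earth_area
print(f"{imbalance:.2f} W/m^2")  # roughly 0.6 W/m^2
```

So even with surface warming temporarily slower than expected, the system is still gaining energy at a substantial rate.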
Bjorn finishes his post by saying
Yes, there is a problem, no, it doesn’t look like the end of the world. Let’s fix global warming without the fear.
Indeed, it doesn’t look like the end of the world. Yes, let’s fix global warming. Let’s do so, however, without underestimating and misrepresenting what we are likely to face.