Climate sensitivity is defined as the average increase of the Earth’s temperature that you get (or expect) from doubling the amount of CO2 in the atmosphere – from 0.028% in the pre-industrial era to 0.056%, a value expected around 2100.
Recall that carbon dioxide is expected to contribute to the warming because of the “greenhouse” effect, and the main question is how large the contribution is. The greenhouse effect is nothing else than the absorption of radiation (mostly infrared radiation emitted by the Earth) by the “greenhouse” gases in the atmosphere, mainly water vapor – but in this case we are focusing on carbon dioxide, one of the five most important gases causing this effect, after water vapor.
If you assume no feedback mechanisms and you simply compute how much additional energy – in the form of infrared rays emitted by (or reflected from) the surface – will be absorbed by the carbon dioxide (refresh your knowledge of Earth’s energy budget), you obtain a value of about 1 Celsius degree for the climate sensitivity.
While the feedback mechanisms may shift the sensitivity in either direction, Prof. Richard Lindzen of MIT, a world leader on the sensitivity question, will convince you that this estimate is about right but that the true value, with the mostly unknown feedback mechanisms, is likely to be lower than the simple calculation suggests. One of the reasons, Lindzen’s own, is a negative feedback from water vapor and clouds. There is, however, another issue here: the dependence of the temperature on the CO2 concentration is not linear but rather “sublinear”. Why is that?
You should realize that carbon dioxide only absorbs the infrared radiation at certain frequencies, and it can absorb at most 100% of the radiation at those frequencies. By this comment, I want to point out that the “forcing” – the expected additive shift of the terrestrial equilibrium temperature – is not a linear function of the carbon dioxide concentration. Instead, the additional greenhouse effect becomes increasingly unimportant as the concentration increases: the expected temperature increase for a single frequency is something like
- 1.5 × ( 1 – exp[ –(concentration – 280 ppm) / 200 ppm ] ) Celsius degrees
The decreasing exponential tells you how much radiation at the critical frequencies is still able to penetrate through the carbon dioxide and leave the planet. The numbers in the formula above are not completely accurate, and the precise exponential form is not quite robust either, but the qualitative message is reliable: when the concentration increases, additional CO2 becomes less and less important.
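Plugging numbers into the saturation formula above makes the diminishing returns explicit. A minimal sketch – the constants 1.5 °C and 200 ppm are just the rough, illustrative values quoted in the text, not precise physics:

```python
import math

# Single-frequency saturation formula from the text; the constants 1.5 C
# and 200 ppm are the rough, illustrative values quoted above.
def warming(concentration_ppm):
    return 1.5 * (1.0 - math.exp(-(concentration_ppm - 280.0) / 200.0))

# The warming added by each successive 100 ppm shrinks as CO2 accumulates:
for c in (380, 480, 580, 680):
    print(c, round(warming(c) - warming(c - 100), 3))
```

Each additional 100 ppm contributes less than the previous one, and the total can never exceed the 1.5 degree prefactor, which is the point of the saturation argument.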
In particular, there exists nothing like a “runaway effect”, a “point of no return”, a “tipping point”, or any of the similar frightening fairy-tales promoted by Al Gore and his numerous soulmates. The formula above simply does not allow more than 1.5 Celsius degrees of warming from the CO2 greenhouse effect. Similar formulae based on Arrhenius’ law predict that the derivative “d Temperature / d Concentration” decreases only as a power law – not exponentially – but it still decreases.
One might also want to obtain a better formula by integrating the single-frequency formula above over frequencies.
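A hedged numerical sketch of what such an integration does – entirely my construction with made-up absorption strengths, not the article’s calculation: if the absorption coefficient varies over many orders of magnitude across frequencies, summing the single-frequency saturation over frequencies yields a total that grows roughly logarithmically with the concentration, so each doubling adds about the same amount.

```python
import math

# Toy "integration over frequencies" (illustrative only): absorption
# strengths k spanning eight orders of magnitude stand in for the
# frequency dependence of CO2's absorption coefficient.
ks = [10.0 ** (-4.0 + 0.01 * i) for i in range(801)]  # k from 1e-4 to 1e4

def total_absorption(c):
    """Average of the single-frequency saturation 1 - exp(-k*c) over the 'frequencies'."""
    return sum(1.0 - math.exp(-k * c) for k in ks) / len(ks)

# Each doubling of the concentration adds roughly the same amount,
# which is the hallmark of a logarithmic dependence:
print(total_absorption(560) - total_absorption(280))
print(total_absorption(1120) - total_absorption(560))
```

The two printed increments come out nearly equal: with a wide spread of absorption strengths, a doubling of the concentration merely shifts the “knee” between saturated and unsaturated frequencies by a fixed amount, reproducing the logarithmic Arrhenius behavior discussed below.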
In all cases, such a possible warming distributed over centuries is certainly nothing that a person with an IQ above 80 should be producing movies about, and nothing that should convince him to stop the world economy.
When you substitute the concentration of 560 ppm (parts per million), you obtain something like a 1 Celsius degree increase relative to the pre-industrial era. But even if you plug in the current concentration of 380 ppm, you obtain about 0.76 Celsius degrees of “global warming”. Although we have only completed about 40% of the proverbial CO2 doubling, we have already achieved about 75% of the warming effect expected from the full doubling: the difference is a result of the exponentially suppressed influence of the growing carbon dioxide concentration.
As Richard Lindzen likes to say, it is just like when you paint your bedroom. The first layer of white makes a lot of difference in the amount of light in that room; additional layers make a smaller contribution.
The first calculation of the climate sensitivity, based on the Stefan-Boltzmann law, was published by the Swedish chemist Arrhenius in 1896: it had some problems but it was a fair starting point. The Carbon Dioxide Calculator on junkscience.com is based on my simple exponential formula, and you must take the exact resulting number with a grain of salt.
More exact treatment: Why is the greenhouse effect a logarithmic function of the concentration?

However, my simple exponential formula agrees with the logarithmic Arrhenius formula to within about 50% up to 1000 ppm or so, a concentration expected around 2300. The changes in the emission by the surface of the Earth can be linearized even though the emission depends on the temperature as “T^4”, because the expected increase of “T” is at most 2 degrees – less than one percent of the normal “room” temperature of about 290 degrees above absolute zero.
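The linearization claim is easy to check numerically – a quick sanity check of the arithmetic, not the article’s full calculation: for a ~2 K change around 290 K, the exact relative change in the T^4 emission and its linearized version 4·ΔT/T nearly coincide.

```python
# Sanity check of the linearization: emission scales as T^4, but for
# a ~2 K change around 290 K the linear approximation 4*dT/T is excellent.
T0, dT = 290.0, 2.0
exact = ((T0 + dT) / T0) ** 4 - 1.0  # exact relative change in emission
linear = 4.0 * dT / T0               # linearized relative change
print(exact, linear)
```

Both numbers come out near 2.8%, differing by only a few hundredths of a percentage point, which is why the “T^4” nonlinearity can be ignored for changes of this size.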
In reality, the increase of the temperatures since the pre-industrial era was comparable to or slightly smaller than 0.76 Celsius degrees – something like 0.6 Celsius degrees. It is consistent to assume that the no-feedback “college physics” calculation of the CO2 greenhouse effect is approximately right and that, if it is not quite right, it is more likely to be an overestimate than an underestimate, given the observed data.
The numbers and calculations above are actually not too controversial. Gavin Schmidt, a well-known alarmist from RealClimate, more or less agrees with the calculated figures, even though he adds a certain amount of fog – he selectively constructs various minor arguments that have the capacity to “tilt” the calculation above in the alarmist direction.
Richard Lindzen would tell you a lot about likely negative (regulating) feedback mechanisms (the iris effect?). Your humble correspondent finds all these mechanisms – positive or negative – plausible, but none of them can really be justified by the available, rather inaccurate data.
But the figure of 1 Celsius degree – understood as a rough estimate – seems to be consistent with everything we see, and Schmidt himself claims that only intellectually challenged climate scientists estimate the sensitivity to be around 5 Celsius degrees (I forget Schmidt’s exact wording). It is also near the result of 1.1 Celsius degrees obtained by Stephen Schwartz in 2007.
Three weeks ago, Hegerl et al. published a text in Nature claiming that the 95 percent confidence interval for the climate sensitivity is between 1.5 and 6.2 Celsius degrees. James Annan decided to publish a reply (with J.C. Hargreaves). As you might know, James Annan – who likes to gamble and to make bets about global warming – is
- an alarmist who believes all kinds of unreasonable things about the dangerous global warming;
- a staunch advocate of the Bayesian probabilistic reasoning.
However, he decided to publish a reply arguing that
- the actual sensitivity is about 5 times smaller than the Hegerl et al. upper bound which means that the warming from the carbon dioxide won’t be too interesting;
- Hegerl et al. made errors in statistical reasoning; the error may be summarized as an application of rationally unjustified Bayesian priors, which is an unscientific step.
The second point of Annan’s is based on the observation that Hegerl et al. simply use a “prior” (a random religious preconception that defines our “primordial state of ignorance” before the sin involving the apple, so to say), a crucial ingredient of Bayesian statistical reasoning. In this particular case, the Hegerl prior simply allows the sensitivity to be huge a priori – and such a dogma is simply too strong a starting point and is not removed by the subsequent procedure of “Bayesian inference”.
Such an outcome is a typical result of Bayesian methods in many cases: garbage in, garbage out. If your assumptions at the beginning are too bad, you won’t obtain accurate results after any finite amount of thinking. Although I don’t want to claim that Annan’s reply was a great paper, I am convinced that Annan’s ability to appreciate these incorrect points of Hegerl et al. is partially a result of my educational influence on him. 😉
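A toy numerical illustration of this prior-sensitivity – entirely my construction with made-up numbers, not the Hegerl or Annan computation: combine the same “likelihood”, peaked at a modest sensitivity, with a prior that permits huge sensitivities and with one that disfavors them, and the resulting 95% upper bounds differ noticeably.

```python
import math

# Toy Bayesian illustration (made-up numbers): the same Gaussian
# "likelihood" peaked at a sensitivity of 2 C is combined with two
# different priors on a grid of sensitivities from 0.1 to 10 C.
grid = [0.1 * i for i in range(1, 101)]

def posterior(prior):
    likelihood = [math.exp(-0.5 * (s - 2.0) ** 2) for s in grid]
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def upper95(post):
    """Smallest grid value below which at least 95% of the posterior mass lies."""
    cum = 0.0
    for s, p in zip(grid, post):
        cum += p
        if cum >= 0.95:
            return s

flat_prior = [1.0] * len(grid)                 # allows huge sensitivity a priori
decaying_prior = [math.exp(-s) for s in grid]  # disfavors huge values a priori

print(upper95(posterior(flat_prior)), upper95(posterior(decaying_prior)))
```

The flat prior drags the 95% upper bound a full degree higher than the decaying prior does, even though the data (the likelihood) are identical – which is the “garbage in, garbage out” mechanism in miniature.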
Nevertheless, Annan’s reply was rejected by Nicki Stevens of Nature without review with the following cute justification:
- We have regretfully decided that publication of this comment as a Brief Communication Arising is not justified, as the concerns you have raised apply more generally to a widespread methodological approach, and not solely to the Hegerl et al. paper.
In other words, Annan’s reply could have the ability to catch errors that influence more than one paper, and such replies are not welcome. Imagine that Nicki Stevens, instead of Max Planck, is the editor of “Annalen der Physik” who receives Albert Einstein’s paper on special relativity. Even better, you can also imagine that Nicki Stevens is the editor who receives the paper on general relativity, whose insights apply even more generally. 😉 Or any other paper that has any scientific meaning, for that matter, because meaningful science simply must be general, at least a little bit.
When we apply my reasoning more generally to a widespread methodological approach of many editors (and journalists), we could also wonder whether the person named Nicki Stevens realized that one half of the internet was going to discuss how unusually profound her misunderstanding of the scientific method was. She seems to believe that scientists should be just little ants who are adding small pieces of dust to a pyramid whose shape has already been determined by someone else, outside science, for example by Al Gore.
See also the Climate Swindle documentary.
Other frequently visited climate articles on The Reference Frame
- Viscount Monckton & climate alarmism
- Naomi Oreskes & fake consensus on global warming
- Temp decided about CO2 concentration, not the other way around
- IQ2 US duel: skeptics outshine alarmists
- Correlations of the Sun and cosmic rays – and temperatures
- 2006: a not too good year for chicken little’s
- 2006: lowest average temperature since 2001