The Economist Provides Readers With Erroneous Information About Arctic Sea Ice | Watts Up With That?
WUWT regular “Just The Facts”:
A June 16th article in the Economist, "The vanishing north", states:
“Between now and early September, when the polar pack ice shrivels to its summer minimum, they will pore over the daily sea ice reports of America’s National Snow and Ice Data Centre. Its satellite data will show that the ice has shrunk far below the long-term average. This is no anomaly: since the 1970s the sea ice has retreated by around 12% each decade. Last year the summer minimum was 4.33m square km (1.67m square miles)—almost half the average for the 1960s.
The Arctic’s glaciers, including those of Greenland’s vast ice cap, are retreating. The land is thawing: the area covered by snow in June is roughly a fifth less than in the 1960s. The permafrost is shrinking. Alien plants, birds, fish and animals are creeping north: Atlantic mackerel, haddock and cod are coming up in Arctic nets. Some Arctic species will probably die out.
Perhaps not since the 19th-century clearance of America’s forests has the world seen such a spectacular environmental change. It is a stunning illustration of global warming, the cause of the melt. It also contains grave warnings of its dangers. The world would be mad to ignore them.”
Published on Jun 16, 2012 by dutchsinse (youtube)
-The Earth is Rattling
-As you see above, there is concentrated movement along the west side of the Ring of Fire, extending south to Australia and New Zealand, and extending west through the Middle East, all the way to Italy. Because this west side of the Ring has been hot for the past week, dutchsinse believes there will be additional movements on the east side of the Ring, which includes the U.S. and Central and South America.-
-The recent quake in Italy killed many people. And the continuous quakes and rumbles going on around the world seem to be increasing; whether they are caused by planetary alignments, solar flares, CMEs, fracking operations, or HAARP, it is clear that our Earth is reacting aggressively to a new energy pattern. Please be on the lookout if you live near the Ring of Fire region; the reach of this hot seismic zone is extending further inland as time goes on. Please pay extra attention to your local weather channel and updates, or subscribe to dutchsinse directly.
Also, what is up with these unreported quakes and rumbles? Why do some quakes go suppressed, under the radar, while most go on public record? What is different about these quakes that requires so much secrecy? Why are some weather anomalies not reported officially, when most are?
Posted on: Thursday, June 14, 2012 12:37 PM
|Walter Anthony et al (2012) have made a major contribution to the picture of methane emissions from thawing Arctic regions. Not a game-changer exactly, but definitely a graphics upgrade, bringing the game to life in stunningly higher resolution (/joke).
Katey Walter Anthony draws upon her previous field findings that methane emissions from the Arctic landscape tend to be focused at the intersection between frozen and thawed, in particular in rings around the peripheries of lakes. She also knew what a methane seep looks like in that landscape, leaving visible bubbles frozen into the ice or maintaining an unfrozen hole in the ice. Now she takes to the skies to produce an aerial survey of the Alaskan landscape, yielding data so much more voluminous than before that it becomes different in kind.
The methane emission fluxes are higher than previous estimates, but that’s not really the most important point, because emissions from the Arctic are small relative to low-latitude wetlands; even if the Arctic fluxes doubled or nearly quadrupled (as in one of their analyzed regions), they would still be small in terms of global climate forcing. And the lifetime of methane in the atmosphere is short, about 10 years, so methane doesn’t build up the way CO2, SF6, and to a lesser extent N2O do.
The really interesting take-away from the new paper is how it shows that the near-surface geology and freezing state conspire to control the venting of accumulated gas dribbling up from below, and the decomposition of frozen soil carbon. They have so many methane seep observations that they are able to correlate them with (1) currently melting permafrost, which allows fossil soil carbon deposits from the last ice age, called Yedoma, to decompose (Zimov et al 2006) and (2) melting ice sheets and glaciers “un-crunching” the landscape as they fade away, making cracks that vent methane from deep thermal sources. Glaciers that melted long ago no longer vent methane, showing that the methane is transiently venting from built-up pools of gas.
What these results do not do is fundamentally change the game, in my opinion. We can now see more clearly that most of the methane flux from the Arctic today is of a type that will respond to climate warming. But the general response time of the system is slow, decades to centuries, rather than potentially poised to release a huge pulse of methane within a few years. Earthquakes and submarine landslides are sudden events, but small individually in terms of potential methane release. The new data do not change that. Walter Anthony et al. compare an estimate of the amount of methane in the Arctic, 1200 Gton C, with the 5 Gton C of methane in the atmosphere. That’s the nightmare comparison, but it’s only really relevant if the methane comes out all at once. (The Arctic estimate is for methane itself and is mostly methane hydrate, but keep in mind that there is also a comparable amount of decomposable soil carbon.)
In my opinion, the largest impact of all this methane will probably be on the long-term future evolution of climate. Avoiding a peak warming of 2 degrees C or more requires keeping the total emission of carbon down to less than about 1000 Gton C (Allen et al 2009). We have already burned about 300 Gton C, and cut about 200 Gton C of forests, although the land surface has taken up enough carbon to make the land flux roughly a net wash. So maybe we’re 1/3 of the way there, say 700 Gton C left to go. The 1200 Gton C of Arctic methane hydrates and the permafrost carbon stack up pretty menacingly against our 700 Gton C left to go, and the comparison is relevant even if the carbon is emitted slowly, or as CO2 rather than methane, or even if it is released into the ocean rather than into the air (it will still equilibrate with the atmosphere after a few centuries, converging to the same “long tail” CO2 trajectory that would have resulted from atmospheric release).
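The budget bookkeeping above is just a few lines of arithmetic; a minimal sketch, using only the figures as quoted in the text (all in Gton C):

```python
# Cumulative carbon budget arithmetic, numbers as quoted in the post (Gton C).
budget_2C = 1000       # total cumulative carbon for ~2 C peak warming (Allen et al 2009)
fossil_burned = 300    # fossil carbon burned so far
land_net = 0           # ~200 cut, but land uptake makes the land flux roughly a net wash

remaining = budget_2C - fossil_burned - land_net
arctic_methane = 1200  # estimated Arctic methane carbon, mostly hydrate

print(remaining)                      # 700 Gton C left to go
print(arctic_methane > remaining)     # the Arctic reservoir exceeds the remaining budget
```

The point of the comparison, as the text notes, holds even for slow release: cumulative carbon is what matters for the long tail.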
Arctic methane, and all that frozen soil carbon, could easily play a huge role, not so much in the near-term evolution of Earth’s climate, but in the long tail of the global warming climate event.
Allen, M.R., D.J. Frame, C. Huntingford, C.D. Jones, J.A. Lowe, M. Meinshausen & N. Meinshausen (2009) Warming caused by cumulative carbon emissions towards the trillionth tonne. Nature 458, doi:10.1038/nature08019
Walter Anthony, K.M., P. Anthony, G. Grosse, & J. Chanton (2012) Geologic methane seeps along boundaries of Arctic permafrost thaw and melting glaciers. Nature Geoscience doi:10.1038/NGEO1480
Zimov, S.A., E.A.G. Schuur & F.S. Chapin III (2006) Permafrost and the global carbon budget. Science 312: 1612–1613
Posted on: Tuesday, May 22, 2012 5:29 AM
|In the Northern Hemisphere, the late 20th / early 21st century has been the hottest time period in the last 400 years at very high confidence, and likely in the last 1000 – 2000 years (or more). It has been unclear whether this is also true in the Southern Hemisphere. Three studies out this week shed considerable new light on this question. This post provides just brief summaries; we’ll have more to say about these studies in the coming weeks.
First, a study by Gergis et al., in the Journal of Climate [Update: this paper has been put on hold – see comments] uses a proxy network from the Australasian region to reconstruct temperature over the last millennium, and finds what can only be described as an Australian hockey stick. They use an ensemble of 3000 different reconstructions, using different methods and different subsets of the proxy network. Worth noting is that while some tree rings are used (which can’t be avoided, as there simply aren’t any other data for some time periods), the reconstruction relies equally on coral records, which are not subject to the same potential (though often-overstated) issues at low frequencies. The conclusion reached is that summer temperatures in the post-1950 period were warmer than anything else in the last 1000 years at high confidence, and in the last ~400 years at very high confidence.
Second, Orsi et al., writing in Geophysical Research Letters, use borehole temperature measurements from the WAIS Divide site in central West Antarctica, a region where the magnitude of recent temperature trends has been the subject of considerable controversy. The results show that the mean warming of the last 50 years has been 0.23°C/decade. This result is in essentially perfect agreement with that of Steig et al. (2009) and in reasonable agreement with that of Monaghan (whose reconstruction for nearby Byrd Station was used in Schneider et al., 2012). The result is totally incompatible (at >>80% confidence) with that of O’Donnell et al. (2010).
This result shouldn’t really surprise anyone: we have previously noted the incompatibility of O’Donnell et al. with independent data. What is surprising, however, is that Orsi et al. find that warming in central West Antarctica has actually accelerated in the last 20 years, to about 0.8°C/decade. This is considerably greater than reported in most previous work (though it does agree well with the reconstruction for Byrd, which is based entirely on weather station data). Although twenty years is a short time period, the 1987–2007 trend is statistically significant.
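As an aside on what a "statistically significant trend" means here, a self-contained sketch of an ordinary least-squares trend and its t-statistic, on synthetic annual anomalies with a built-in 0.8 °C/decade trend (the data are invented for illustration and are not the WAIS Divide record):

```python
import math

# 21 synthetic annual anomalies with a 0.08 C/yr trend plus a deterministic
# wiggle standing in for interannual noise. Illustrative only.
years = list(range(21))
temps = [0.08 * t + 0.2 * math.sin(2.7 * t) for t in years]

n = len(years)
xbar = sum(years) / n
ybar = sum(temps) / n
sxx = sum((x - xbar) ** 2 for x in years)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(years, temps)) / sxx

# Standard error of the slope from the residuals, and the t-statistic
resid = [(y - ybar) - slope * (x - xbar) for x, y in zip(years, temps)]
se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
t_stat = slope / se
print(round(10 * slope, 2), "C per decade; t =", round(t_stat, 1))
```

With |t| well above ~2, the trend is significant at the conventional 95% level even for a short record, which is the kind of test behind the statement above.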
We and others have shown (e.g. Ding et al., 2011), that the rapid warming of West Antarctica is intimately tied to the remarkable changes that have also occurred in the tropics in the last two decades. Note that the Orsi et al. paper actually focuses very little on the recent temperature rise; it is mostly about the "Little-ice-age-like" signal of temperature in West Antarctica. Also, these results cannot address the question of whether the recent warming is exceptional over the long term — borehole temperatures are highly smoothed by diffusion, and the farther back in time, the greater the diffusion. We’ll discuss both these aspects of the Orsi et al. study at greater length in a future post.
Last but not least, a new paper by Zagorodnov et al. in The Cryosphere uses temperature measurements from two new boreholes on the Antarctic Peninsula to show that the decade of the 1990s (the paper states “1995+/-5 years”) was the warmest of at least the last 70 years. This is not at all a surprising result for the Peninsula — it was already well known that the Peninsula has been warming rapidly, but these new results add considerable confidence to the assumption that the warming is not just a recent event. Note that the “last 70 years” conclusion reflects the relatively shallow depth of the boreholes, and the fact that diffusive damping of the temperature signal means that one cannot say anything about high frequency variability prior to that. The inference cannot be made that it was warmer than present more than 70 years ago. In the one and only century-long meteorological record from the region — at the Orcadas station, just north of the Antarctic Peninsula — warming has been pretty much monotonic since the 1950s, and the period from 1903 to 1950 was cooler than anything after about 1970 (see e.g. Zazulie et al., 2010). Whether recent warming on the Peninsula is exceptional over a longer time frame will have to await new data from ice cores.
The temperatures and carbon dioxide concentrations have been correlated – see e.g. Petit et al., Nature 1999 – but we know for sure that the temperature was the cause and the concentration was its consequence, not the other way around. This fact has also been explained in The Great Global Warming Swindle. It follows that the CO₂ greenhouse effect has not been important in the past and we shouldn’t expect that it will become important in the future.
Special comment for Australian readers on Sep 28, 2007: just yesterday, there was a new paper in Science – Lowell et al., Science 2007 – that showed that CO₂ lagged by about 1,000 years when the last ice age started to end 18,000 years ago.
The direction of the causal relationship can be shown in many ways: for example, it is not just CO₂ but other gases such as methane that follow temperature. The hypothesis of CO₂ as the primary reason wouldn’t explain why these other gases are correlated, too. Also, we understand how oceans react to temperature changes by releasing gases. Finally, the gas concentrations lag behind the temperature by 800 years, see e.g. this 2003 paper in Science by Caillon et al.
- See also: climate sensitivity & nonlinear relationship between CO₂ and temperature
- MS Word introduction to the climate debate
The movie of the former future U.S. president – "An Inconvenient Truth" – has impressed many viewers: it is an optimized promotion of the alarmist understanding of the global climate. Moreover, it shows a more attractive Al Gore than the old Al Gore whom we know from the 2000 campaign.
A few years ago, Gore visited Harvard and with Jochen Brocks, my fellow Fellow, we went to see him. Jochen is a leftist, of course, but he claimed that Gore looked repulsive, unhuman, and evil. I am a rightist but paradoxically, I never had terribly serious complaints about Gore’s looks.
Don’t get me wrong: I certainly think that George Bush is more human and seems a more trustworthy and more human being than Al Gore, and I wish him the best on his 60th birthday! Nevertheless, their looks are not the primary thing that determines my political and scientific opinions.
That’s why I am going to discuss more important issues, namely the scientific ones. The most powerful argument in Al Gore’s movie was the set of graphs showing the correlation between the carbon dioxide concentrations and the temperature extracted from ice-core data over the last 650,000 years.
Figure 1: Correlations between the temperature and the concentrations extracted from the ice cores. Combined graph by Thomas Stocker of the University of Bern, Switzerland
(Incidentally, if you care, the concentration of CO2 and CH4 is determined by a direct chemical analysis of the bubbles. The temperature is reconstructed from the concentration of frozen heavy water – water with one normal hydrogen atom H replaced by the heavy hydrogen D, known as deuterium – in the same ice. Why? Because the warmer the weather was, the easier it was for heavy water vapor molecules to get to a sufficient altitude – think about the Maxwell-Boltzmann distribution – and to join clouds whose precipitation was adding ice to the ice sheet during the same year.)
No doubt, the correlation between the temperature and CO₂ is nearly perfect. No doubt, the climate on the Earth in the 650,000 years before the industrial revolution can be described very accurately by a single function of time. But if two things, A and B, are correlated, does it imply a particular causal relationship?
In classical physics, the answer is essentially Yes. The perfect correlation must either mean that A is caused by B, or B is caused by A, or both A and B are caused by something else, namely D. It is completely clear what Al Gore’s answer is: the temperature was determined by the concentrations of carbon dioxide. That’s why all of us are going to die in hell by Independence Day 2016 unless all of us accept Al Gore as the ultimate savior, neglecting that he is not a Christ but rather an anti-Christ as Rae Ann has noticed. 😉
Figure 2: Climate scientists extract the ice cores.
According to Gore, the concentration of carbon dioxide from the ice core records (see the picture above) was evolving according to its free will and does not require any explanation. The concentration could have been caused by oil companies owned by various mammoths. At any rate, Al Gore does not have to answer why the carbon dioxide concentration was changing in the first place. He does not have to answer because he is the savior.
Now imagine that you have the freedom to think about these things rationally, as opposed to metareligious quasithinking under the influence of crazy brainwashing. First, let us try the following exercise.
What is the cause and what is its effect
Imagine that you find out that whenever you smell methane in the living room, you can also find a certain person in the same room. The correlation is nearly perfect. What is the conclusion? Someone could propose that the methane in the room is the cause whose presence creates the person. I would propose an "alternative" explanation: it is the person who creates the methane whenever he is in the room. Choose any explanation you want.
I picked methane because it will play a role in the main example, too.
You should notice that the graph above shows a perfect correlation not only between the temperature (A) and the carbon dioxide concentration (B), but also between the temperature (A) and the methane concentration (C). What is the cause and what is the consequence if three quantities are correlated so nicely?
Note that the answer can’t be unique a priori. At most one of the three quantities – A, B, C – can be the primary cause. Which one? Clearly, if you choose one of the gases, your explanation will be asymmetric and it won’t explain all the correlations in a satisfactory way. If you say that the carbon dioxide concentration determines the temperature, you must still explain why the methane concentration (and other concentrations such as N2O, for that matter) follows the same time dependence. You will clearly need a different explanation. If the CO₂ greenhouse effect is primary, you can’t explain why the concentrations of CO₂ and CH4 vary together. Unless you find another inevitable explanation of this subtlety, your theory will be very weak.
Actually, we have more than logical arguments of this kind. We know very well why the causal relation is the opposite one. Imagine that you have a small bottle with 385 milliliters of Coke. It originally contained 4 volumes of carbon dioxide: if you extract the carbon dioxide from one bottle of Coke into empty bottles at normal conditions, you will fill four bottles. I had to learn these things when we discussed various thermodynamical issues with Brian Greene when he was writing his second excellent book. Now, imagine that the CO₂ has leaked a bit and there is only 1 volume of CO₂ left in the bottle.
Take this bottle to your car, whose internal volume is one cubic meter, i.e. one million milliliters. The carbon dioxide from the Coke makes up 385 ppm (parts per million) of the volume of your car – just like the ratio in the atmosphere.
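The bottle-to-car arithmetic can be checked in a couple of lines (volumes as given in the text):

```python
bottle_ml = 385        # volume of the Coke bottle
co2_volumes = 1        # "1 volume" of CO2 left after some has leaked out
co2_ml = co2_volumes * bottle_ml   # gaseous CO2 at normal conditions, in mL
car_ml = 1_000_000     # one cubic meter of car interior = 1e6 mL

ppm = co2_ml / car_ml * 1_000_000
print(ppm)  # 385.0, matching the atmospheric CO2 ratio quoted in the text
```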
Suddenly, you notice a strange correlation between the concentration and character of the bubbles in the bottle on one side, and the temperature in your car on the other side. You have two possible interpretations. Either the leaking CO₂ in the Coke determines the temperature in your car, because the Coke with more CO₂ is a bit darker to the Sun shining through your windows (or to the infrared rays reflected from the seats), or the temperature in your car determines how the bubbles behave in the bottle. Which explanation do you choose? 😉
I think that any sane person obviously chooses the temperature as the cause and the concentrations as a consequence. Everyone who has ever tried to open a bottle of lemonade during a hot day must know why. Hot liquids are not able to absorb gases so well. Warmer oceans are not able to absorb atmospheric gases either. Clearly, if the temperature goes up, less carbon dioxide and methane can be bound to the ocean waters, which is why their concentration in the atmosphere goes up: this process is known as outgassing.
This explanation obviously works both for CO₂ as well as CH4 and other gases that could appear such as N2O.
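The outgassing claim can be made quantitative with Henry's law: the solubility of CO₂ in water falls as the water warms. A sketch using standard textbook values for CO₂ (a Henry constant of about 3.4×10⁻² mol/(L·atm) at 298 K and a van 't Hoff temperature coefficient of roughly 2400 K); the two ocean temperatures are illustrative choices, not measurements:

```python
import math

def henry_co2(T_kelvin):
    """CO2 solubility in water, mol/(L*atm), via van 't Hoff scaling.
    Reference values: ~3.4e-2 at 298.15 K, temperature coefficient ~2400 K."""
    return 3.4e-2 * math.exp(2400.0 * (1.0 / T_kelvin - 1.0 / 298.15))

cold, warm = henry_co2(275.0), henry_co2(285.0)  # illustrative ocean temperatures
print(cold > warm)  # True: colder water holds more dissolved CO2
```

A 10 K warming in this toy calculation cuts the equilibrium solubility by roughly a quarter, which is the direction of the outgassing mechanism described above.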
There are many other mechanisms that contribute to the correlation between the temperature and the concentrations. For example, the frequency of fires may increase when temperatures are warmer, and fires create more CO₂. Also, the growth of plants and animals (consumers and producers of CO₂) depends on temperatures – but the most important contributions to the correlation work in such a way that the temperature is primary and the concentrations are secondary. If you think for a while, you will realize that the example with the car is actually pretty realistic and that the ability of water to bind gases is a much stronger effect than the greenhouse effect.
Even if you did not believe that conclusion and preferred Al Gore’s explanation that methane and CO₂ create the person or the warming, you will have problems predicting the future. While the correlation between A, B, C was nearly perfect in the past, we have violated this perfect harmony because we produce CO₂ and CH4 at different rates. We can deliberately do so. You won’t get any natural prediction for the temperature because the correlation data itself can’t tell you how much the two gases contribute.
Figure 3: A map of Lake Vostok. The deeper you go, the further back in history you get, because the ice was being added at the top. Different years look like different layers of ice, in analogy with tree rings.
You had better look at physics, and physics tells you quite clearly that the ability of water to bind gases is a more important effect for the correlation than the greenhouse effect, and this fact will influence the measurements from the subglacial Lake Vostok system, a Russian research site in Antarctica (see drawing above). The temperature is the primary cause of secondary quantities such as various concentrations – and I would expect advocates of a "global warming" theory to agree with me that the temperature should be the fundamental quantity. This description explains all the correlations and not just some of them.
Much like all other potential explanations, it still says nothing about the origin of the "primary" quantity, in this case temperature. If temperature is indeed the primary and fundamental quantity, why was it changing the way it did?
There are many contributions to the temperature variations that we partially know – such as various periodic astronomical cycles or solar variation – and there are many others that we don’t know well or don’t know at all – such as nonlinear chaotic effects in the formation of different kinds of clouds. But I think that even though we don’t know some things for sure and in their entirety, we can still be pretty much sure that certain hypotheses are almost certainly incorrect. The hypothesis that the CO₂ concentration was primary and determined the CH4 concentrations and the temperature is one such extremely unlikely hypothesis.
And that’s the memo. But let us add a cute and important point.
What does the 800-year lag mean
There exists a simpler way to show that the temperature was the cause and the carbon dioxide concentration a consequence, not the other way around. If you look carefully at the graphs, you will see that the carbon dioxide concentrations lag behind the temperature by 800 years. There have been many papers that found and reported the lag. One of the newest and most accurate ones is the 2003 paper in Science by Caillon et al.
On the graph above, the past is on the right side; time goes to the left. You can see that the Antarctic temperature starts to change first, and CO₂ responds with an 800-year lag. Methane is still correlated with both. The graph is not new. Today, we have many more accurate graphs of this kind, many of which are from the more distant past. We also have a more detailed analysis by Stott et al. (Science 2007) of the end of the last ice age 19,000 years ago, where CO₂ lagged by about 1,000 years, too.
The explanation is obvious: oceans are large and it simply takes centuries for them to warm up or cool down before they release or absorb gases.
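How such a lag is read off from two records can be illustrated with a toy calculation: build a "temperature" series, a "CO₂" series that follows it 800 years later, and find the shift that best aligns them. The sinusoidal series and 20-year sampling are invented for illustration; this is not real ice-core data or the published method:

```python
import math

# Toy series: "temperature" leads, "CO2" follows 800 years later.
dt = 20                  # years per sample (roughly ice-core resolution)
period = 10000.0         # glacial-cycle-like period, years
true_lag = 800           # years
years = range(0, 20000, dt)
T = [math.sin(2 * math.pi * y / period) for y in years]
C = [math.sin(2 * math.pi * (y - true_lag) / period) for y in years]

def misfit(steps):
    """Mean squared difference between T and C shifted back by `steps` samples."""
    a = T[: len(T) - steps] if steps else T
    b = C[steps:]
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

# Try shifts of 0..1980 years and keep the one that aligns the series best.
best_lag = dt * min(range(100), key=misfit)
print(best_lag)  # 800: the built-in lag is recovered
```

Real analyses use cross-correlation on noisy, unevenly sampled records, but the idea is the same: the best-aligning shift estimates the lag.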
The work proving the lag was recently explained in Scientific American as well as at RealClimate, where they also essentially claim that you can easily produce a time machine as long as you want to travel only 800 years – or anything less than 5,000 years – into your past. 😉
See also: CO₂ lag and how alarmists think
I leave it up to you whether you learn just the hard data or also their bizarre interpretation, and whether you think that the RealClimate people are sane, given this interpretation. I personally don’t think so. They would be right if they said that 90% of the time, the temperature and gas concentrations move together, and that if you could hide the remaining 10% of the data, you couldn’t learn the direction of the causal relationship.
But scientists who don’t want to close their eyes can look at these critical 10% of the data, too. The result of such an analysis is that the impact of temperatures on gas concentrations is much stronger than the opposite influences, including the greenhouse effect. This fact can be extracted from the time periods where the trend is changing, but because the physical laws themselves don’t change, it is very clear that in the remaining periods it is still true that the influence of temperature on the gases is stronger than the opposite influence. The only way to hide this conclusion is censorship, witch hunts, and the burning of heretics at the stake. There is no scientific way to deny this clear conclusion from the data.
The comments in some of these articles that the greenhouse effect could still be important are just fog that the authors included in order for their "politically incorrect" scientific conclusions to get published. This fog was probably incorporated into these papers by alarmist reviewers, but it makes no sense whatsoever.
It follows from an analysis of the data that the greenhouse effect couldn’t have been too important at the multi-millennium timescale.
Appendix: Gore’s lift
If you have seen Al Gore’s movie, you may also remember the lift. He argued that because there has been a correlation between CO2 and the temperature during the glaciation cycles, the significant recent growth of CO2 may be directly translated into a huge warming.
But we already know that this prediction is falsified, either by understanding the opposite direction of the causal relationship, as explained above, or simply by looking at the basic numbers:
During the ice ages, the concentration was 180 ppm (parts per million) and it grew to 280 ppm or so during the (warm) interglacials. This increase by 100 ppm of CO2 was accompanied by 8 °C of warming or so. But the same increase of CO2 from 280 ppm in 1800 to 380 ppm in 2005 was only accompanied by the measured 0.7 °C warming or so (even if we assume that all of the observed warming is man-made), more than one order of magnitude smaller than 8 °C. We simply know that the warming caused by CO2 is at least 10 times smaller than Al Gore tries to suggest with his exercise.
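The comparison in the paragraph above is simple slope arithmetic; a minimal sketch using the figures as quoted (glacial cycles: ~8 °C per 100 ppm; industrial era: ~0.7 °C per 100 ppm):

```python
# Warming per ppm of CO2 rise, numbers as quoted in the text.
glacial_dT, glacial_dC = 8.0, 100.0   # C and ppm, 180 -> 280 ppm glacial cycle
modern_dT, modern_dC = 0.7, 100.0     # C and ppm, 280 -> 380 ppm since 1800

glacial_slope = glacial_dT / glacial_dC  # C per ppm during glacial cycles
modern_slope = modern_dT / modern_dC     # C per ppm in the industrial era

ratio = glacial_slope / modern_slope
print(round(ratio, 1))  # ~11.4: more than an order of magnitude apart
```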
Incidentally, if you care, after many centuries or a few millennia, the correlation between CO2 and the temperature will be restored again. But the details of how it will happen are inconvenient, too: within a few centuries after we stop adding CO2 to the atmosphere, the oceans will absorb or "suck" most of the "excessive" CO2. Therefore, they will also undo the small warming of 1 °C or so that the excessive CO2 has caused. The oceans have a huge capacity.
Other likable climate articles on The Reference Frame
- January 2008 was the coldest month since 1994 (HadCRUT3)
- Christopher Monckton & warmers
- Temperature changed before CO₂ concentrations, not the other way around
- Intelligence Squared US debate: deniers beat alarmists
- Temperature-CO₂ sensitivity is sublinear
- Correlation of sunspots and cosmic rays vs temperature
- 2006: a painful year for Chicken Littles
- 2006: colder than 2002 – 2005
Climate sensitivity is defined as the average increase of the temperature of the Earth that you get (or expect) by doubling the amount of CO2 in the atmosphere – from 0.028% in the pre-industrial era to the future value of 0.056% (expected around 2100).
Recall that the contribution of carbon dioxide to the warming is expected because of the “greenhouse” effect, and the main question is how large it is. The greenhouse effect is nothing else than the absorption (of mostly infrared radiation emitted by the Earth) by the “greenhouse” gases in the atmosphere, mainly water vapor – but in this case we are focusing on carbon dioxide, the most important of the gases causing this effect after water vapor.
If you assume no feedback mechanisms and you just compute how much additional energy in the form of infrared rays emitted by (or reflected from) the surface will be absorbed by the carbon dioxide (refresh your knowledge about Earth’s energy budget), you obtain the value of 1 Celsius degree or so for the climate sensitivity.
While the feedback mechanisms may shift the sensitivity in either direction, Prof. Richard Lindzen of MIT, a world leader on the sensitivity issue, will convince you that the estimate is about right, but that the true value, with the mostly unknown feedback mechanisms, is likely to be lower than the simple calculation. One of the reasons, Lindzen’s own, is a negative feedback from water vapor and clouds. There is however another issue here: the dependence of the temperature on the CO2 concentration is not linear but rather “sublinear”. Why is it so?
You should realize that carbon dioxide only absorbs infrared radiation at certain frequencies, and it can absorb at most 100% of the radiation at those frequencies. By this comment, I want to point out that the “forcing” – the expected additive shift of the terrestrial equilibrium temperature – is not a linear function of the carbon dioxide concentration. Instead, the additional greenhouse effect becomes increasingly unimportant as the concentration increases: the expected temperature increase for a single frequency is something like
- ΔT ≈ 1.5 × (1 − exp[−(concentration − 280 ppm) / 200 ppm]) °C
The decreasing exponential tells you how much radiation at the critical frequencies is able to penetrate through the carbon dioxide and leave the planet. The numbers in the formula above are not completely accurate and the precise exponential form is not quite robust either but the qualitative message is reliable. When the concentration increases, additional CO2 becomes less and less important.
In particular, there exists nothing such as a “runaway effect” or a “point of no return” or a “tipping point” or any of the similar frightening fairy-tales promoted by Al Gore and his numerous soulmates. The formula above simply does not allow more than 1.5 Celsius degrees of warming from the CO2 greenhouse effect. A similar formula based on Arrhenius’ law predicts that the derivative “dTemperature/dConcentration” decreases only as a power law – not exponentially – but it still decreases.
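Taking the illustrative formula above at face value (the text itself warns that its constants are only rough), a few lines confirm the saturation behavior it describes: each further doubling of CO₂ adds a smaller increment, and the expression never exceeds its 1.5 °C asymptote. The chosen concentrations are arbitrary illustration points:

```python
import math

def dT(c_ppm):
    """The post's illustrative fit: CO2-only warming above pre-industrial, in C."""
    return 1.5 * (1.0 - math.exp(-(c_ppm - 280.0) / 200.0))

concs = [560, 1120, 2240, 4480]          # successive doublings from 560 ppm
warmings = [dT(c) for c in concs]
increments = [b - a for a, b in zip(warmings, warmings[1:])]

print([round(w, 3) for w in warmings])   # rises toward, but never reaches, 1.5
print([round(i, 4) for i in increments]) # each doubling adds less than the last
```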
One might also want to obtain a better formula by integrating the formula above over frequencies.
In all cases, such a possible warming distributed over centuries is certainly nothing that a person with an IQ above 80 should be producing movies about, and nothing that should convince him to stop the world economy.
When you substitute the concentration of 560 ppm (parts per million), you obtain something like a 1 Celsius degree increase relative to the pre-industrial era. But even if you plug in the current concentration of 380 ppm, you obtain about 0.76 Celsius degrees of “global warming”. Although we have only completed about 40% of the proverbial CO2 doubling, we have already achieved about 75% of the warming effect that is expected from such a doubling: the difference is a result of the exponentially suppressed influence of the growing carbon dioxide concentration.
As Richard Lindzen likes to say, it is just like when you paint your bedroom. The first layer of white makes a lot of difference in the amount of light in that room; additional layers make a smaller contribution.
The first calculation of the climate sensitivity, based on the Stefan-Boltzmann law, was published by the Swedish chemist Arrhenius in 1896: it had some problems but it was a fair starting point. The Carbon Dioxide Calculator on junkscience.com is based on my simple exponential formula and you must take the exact resulting number with a grain of salt.
More exact treatment: Why is the greenhouse effect a logarithmic function of concentration? However, my simple exponential formula agrees with the logarithmic Arrhenius formula to within plus or minus 50% up to 1000 ppm or so, a level expected around 2300. The changes in the emission by the surface of the Earth can be linearized even though they depend on the temperature as “T^4”, because the expected increase of “T” is at most 2 degrees, less than one percent of the normal “room” temperature of 290 degrees above absolute zero.
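For comparison, the same roughly 1 Celsius degree no-feedback figure can be reproduced with the standard simplified logarithmic forcing expression (5.35 W/m^2 times ln(C/C0), from Myhre et al. 1998, not from this article) combined with the linearized Stefan-Boltzmann emission just described; a sketch, assuming the usual 255 K effective emission temperature:

```python
import math

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Standard simplified CO2 radiative forcing in W/m^2 (Myhre et al. 1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def no_feedback_warming(delta_f, t_eff=255.0):
    """Linearized Stefan-Boltzmann: dF = 4*sigma*T^3*dT, so dT = dF/(4*sigma*T^3)."""
    return delta_f / (4.0 * SIGMA * t_eff ** 3)

delta_f = co2_forcing(560.0)                   # ~3.7 W/m^2 for doubled CO2
print(round(no_feedback_warming(delta_f), 2))  # just under 1 C, no feedbacks
```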
In reality, the increase of the temperatures since the pre-industrial era was comparable to, or slightly smaller than, 0.76 Celsius degrees – something like 0.6 Celsius degrees. It is consistent to assume that the no-feedback “college physics” calculation of the CO2 greenhouse effect is approximately right, and if it is not quite right, it is more likely to be an overestimate rather than an underestimate, given the observed data.
The numbers and calculations above are actually not too controversial. Gavin Schmidt, a well-known alarmist from RealClimate, more or less agrees with the calculated figures, even though he adds a certain amount of fog – he selectively constructs various minor arguments that have the capacity to “tilt” the calculation above in the alarmist direction.
Richard Lindzen would tell you a lot about likely negative (regulating) feedback mechanisms (the iris effect?). Your humble correspondent finds all these mechanisms – positive or negative – plausible but neither of them can really be justified by the available, rather inaccurate data.
But the figure of 1 Celsius degree – understood as a rough estimate – seems to be consistent with everything we see and Schmidt himself claims that only intellectually challenged climate scientists estimate the sensitivity to be around 5 Celsius degrees (I forgot Schmidt’s exact wording). It is also near the result of 1.1 Celsius degrees obtained by Stephen Schwartz in 2007.
Three weeks ago, Hegerl et al. published a paper in Nature claiming that the 95 percent confidence interval for the climate sensitivity is between 1.5 and 6.2 Celsius degrees. James Annan decided to publish a reply (with J.C. Hargreaves). As you might know, James Annan – who likes to gamble and make bets about global warming – is
- an alarmist who believes all kinds of unreasonable things about the dangerous global warming;
- a staunch advocate of the Bayesian probabilistic reasoning.
However, he decided to publish a reply arguing that
- the actual sensitivity is about 5 times smaller than the Hegerl et al. upper bound which means that the warming from the carbon dioxide won’t be too interesting;
- Hegerl et al. have made errors in statistical reasoning; the error may be summarized as an application of rationally unjustified Bayesian priors which is an unscientific step.
The second point of Annan is based on the observation that Hegerl et al. simply use a “prior” (a random religious preconception that defines our “primordial state of ignorance” before the sin involving the apple, so to say) that is a crucial part of the Bayesian statistical reasoning. In this particular case, the Hegerl prior simply allows the sensitivity to be huge a priori – and such a dogma to start with is simply too strong and is not removed by the subsequent procedure of “Bayesian inference”.
Such an outcome is a typical result of Bayesian methods in many cases: garbage in, garbage out. If your assumptions at the beginning are too bad, you won’t obtain accurate results after any finite time spent by thinking. Although I don’t want to claim that Annan’s reply was a great paper, I am convinced that the fact that Annan was able to appreciate these incorrect points of Hegerl et al. is partially a result of my educational influence on James Annan. 😉
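The prior-dominance point can be demonstrated with a toy grid-based Bayesian update. Everything numerical here is hypothetical: a made-up likelihood in which the feedback parameter 3.7/S is treated as measured with Gaussian errors (which gives the sensitivity S the characteristic heavy upper tail), and two uniform priors that differ only in how much room they allow above roughly 5 degrees:

```python
import math

def posterior_q95(prior_max):
    """95th percentile of a toy posterior for climate sensitivity S (deg C).

    Hypothetical likelihood: the feedback parameter 3.7/S is treated as
    measured Gaussian(mean=1.2, sd=0.5), so the likelihood of S flattens
    out at large S instead of vanishing. Prior: uniform on (0, prior_max].
    """
    n = 100_000
    xs = [prior_max * (i + 0.5) / n for i in range(n)]
    weights = [math.exp(-0.5 * ((3.7 / x - 1.2) / 0.5) ** 2) for x in xs]
    total = sum(weights)
    acc = 0.0
    for x, w in zip(xs, weights):
        acc += w
        if acc >= 0.95 * total:
            return x
    return xs[-1]

# Identical "data", different priors: the upper bound is set almost
# entirely by where the prior stops, not by the observations.
print(round(posterior_q95(6.0), 1))
print(round(posterior_q95(20.0), 1))
```

With the narrow prior the 95% bound sits near the top of the prior range; with the wide prior it jumps far upward even though the "observations" are unchanged, which is Annan's complaint in miniature.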
Nevertheless, Annan’s reply was rejected by Nicki Stevens of Nature without review with the following cute justification:
- We have regretfully decided that publication of this comment as a Brief Communication Arising is not justified, as the concerns you have raised apply more generally to a widespread methodological approach, and not solely to the Hegerl et al. paper.
In other words, Annan’s reply could have the ability to catch errors that influence more than one paper, and such replies are not welcome. Imagine that Nicki Stevens, instead of Max Planck, is the editor of “Annalen der Physik” who receives Albert Einstein’s paper on special relativity. Even better, you can also imagine that Nicki Stevens is the editor who receives the paper on general relativity, whose insights apply more generally. 😉 Or any other paper that has any scientific meaning, for that matter, because meaningful science simply must be general, at least a little bit.
When we apply my reasoning more generally to a widespread methodological approach of many editors (and journalists), we could also wonder whether the person named Nicki Stevens realized that one half of the internet was going to discuss how unusually profound her misunderstanding of the scientific method was. She seems to believe that scientists should be just little ants who are adding small pieces of dust to a pyramid whose shape has already been determined by someone else, outside science, for example by Al Gore.
See also the Climate Swindle documentary.
Other frequently visited climate articles on The Reference Frame
- Viscount Monckton & climate alarmism
- Naomi Oreskes & fake consensus on global warming
- Temp decided about CO2 concentration, not the other way around
- IQ2 US duel: skeptics outshine alarmists
- Correlations of the Sun and cosmic rays – and temperatures
- 2006: a not too good year for chicken little’s
- 2006: lowest average temperature since 2001
The temperatures and carbon dioxide concentrations have been correlated – see e.g. Petit et al., Nature 1999 – but we know for sure that the temperature was the cause and the concentration was its consequence, not the other way around. This fact has also been explained in The Great Global Warming Swindle. It follows that the CO2 greenhouse effect has not been important in history and we shouldn’t expect that it will become important in the future.
Our Ocean Acidification Database is an ever-growing archive of the responses of various growth and developmental parameters of marine organisms immersed in seawater at or near today’s oceanic pH level, as well as at levels lower than today’s.
Feed: Skeptical Science
Posted on: Sunday, June 24, 2012 1:28 AM
James Powell’s iBook Going to Extremes is an informative read about the recent weather extremes around the globe, with an emphasis on the U.S., which experienced 14 billion-dollar weather disasters in 2011, the most in its history. My short review will mostly be about the advantages of this relatively new type of book and not so much about the content, which will be very familiar to regular readers of Skeptical Science.
The iBook format is ideal for a topic like weather extremes and their relationship with climate change, as it makes it easy to include not just pictures but also videos and interactive graphics. You’ll come across videos of floods as well as satellite footage of events like the inundation of Cairo Beach:
These multi-media additions make reading this as an iBook a lot more interesting than reading it the "traditional way" as a printed book. I was especially impressed by several "before-and-after" satellite images depicting towns like Joplin before and after the tornado hit on May 22, 2011.
First you see this:
…and then – with just a little gesture or tip of your finger – this:
Another big advantage for both authors and readers of iBooks is the ease and speed with which it is possible to update them. For a book like Going to Extremes, this means that it can be kept current with regular updates to include more recent occurrences of extreme weather events. In fact, since being first published, James Powell has already added one chapter to the iBook with information up to June 6, 2012 and periodic future updates are planned.
With only around 100 iBook pages, Going to Extremes is a very concise and quick read. For readers interested in more details, the author includes many live links to additional information available on the internet. Some of these links, for example, lead directly to the scientific literature supporting the theory that human-caused global warming has become a major contributing factor in many extreme weather events around the globe. (The author does make it clear that it is impossible to state that any single weather event was directly caused solely by global warming; but, as the Earth warms and weather systems grow more energetic, an increase in extreme events has long been anticipated.)
All in all a very worthwhile book to download as you’ll get a lot of information readily available at your fingertips and at $0.99 this is really a bargain!
Posted on: Friday, June 22, 2012 11:36 PM
This is a re-post from ClimateSight.
Posted on: Friday, June 08, 2012 11:59 AM
Over at Real Climate Economics, ACEEE’s Director of Economic and Social Analysis, Skip Laitner, shares some thoughts about energy intensity and Rio+20:
To continue reading visit Skip Laitner’s blog “Desert Year” at Real Climate Economics.
Yes, U.S. Oil and Gas Production Is Increasing, but Energy Efficiency Is Still the Number One Resource
Posted on: Tuesday, June 19, 2012 6:30 AM
A variety of recent articles have trumpeted how U.S. oil and gas production is up. For example, Daniel Yergin, in a New York Times op-ed, notes that U.S. oil production has increased 1.6 million barrels per day since 1998 and that a further 0.6 million barrel increase may be possible this year. He also notes how shale gas is now 37% of U.S. production, up from 2% a dozen years ago. And he quotes President Obama as saying that shale gas development had by 2010 supported 600,000 jobs (this includes direct, indirect and induced jobs). EIA notes that U.S. oil production in the first quarter of 2012 is at the highest level since 1998. These increases are largely driven by advances in hydraulic fracturing or “fracking,” allowing “tight oil” and shale gas to be profitably extracted.
While hydraulic fracturing goes back to the 1860s, and the U.S. government and the Gas Research Institute conducted critical research in the 1970s, the modern shale gas boom was made possible by a new hydraulic fracturing system developed by Mitchell Energy in 1997.
U.S. energy efficiency has also increased substantially since 1997. According to EIA, in 2011 the U.S. consumed 2,300 fewer Btu per dollar of GDP than in 1997—a 24% decline. These energy use reductions amount to the equivalent of 14.5 million barrels per day, dwarfing the increase in oil production and the amount of shale gas we currently produce.
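That equivalence is easy to sanity-check. Both inputs below are assumptions for illustration (real U.S. GDP of roughly $13.3 trillion in 2005 dollars for 2011, and the conventional 5.8 million Btu per barrel of oil equivalent), not figures from the article:

```python
# Back-of-the-envelope check of the barrels-per-day equivalence.
GDP_DOLLARS = 13.3e12        # assumed real U.S. GDP, 2011 (2005 dollars)
BTU_SAVED_PER_DOLLAR = 2300  # from the EIA figure quoted above
BTU_PER_BARREL = 5.8e6       # conventional barrel-of-oil-equivalent

btu_saved_per_year = GDP_DOLLARS * BTU_SAVED_PER_DOLLAR
barrels_per_day = btu_saved_per_year / BTU_PER_BARREL / 365
print(round(barrels_per_day / 1e6, 1))  # ~14.4 million, close to the 14.5 quoted
```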
Contributions Since 1997 of Domestic Oil, Gas and Energy Efficiency Resources to Meeting U.S. Energy Needs
Going forward, EIA, in their preliminary Annual Energy Outlook 2012, predicts that U.S. shale gas production will increase over the 2012–2035 period by the equivalent of nearly 3 million barrels of oil per day and U.S. oil production will increase by 0.24 million barrels per day. In comparison, a January 2012 study by ACEEE on long-term energy efficiency potential estimates that by 2050 we can use energy efficiency to reduce U.S. energy use, relative to a business-as-usual basecase, by 42–59%. In 2035, the midpoint estimate is equivalent to 17.5 million barrels per day. And this study further estimated that these energy efficiency investments and their resulting energy bill savings will support 1.3–1.9 million jobs in 2050.
So yes, recent increases in oil and gas production are important for the United States. But we should not lose sight of our #1 energy resource over the past several decades and well into the future: energy efficiency.
Posted on: Tuesday, June 19, 2012 7:08 AM
The myth that Hansen’s 1988 prediction was wrong is one of those zombie myths that always keeps coming back even after you chop its head off time and time again. The newest incarnation of this myth comes from Jan-Erik Solheim, who in a 272 word article promoted by Fritz Vahrenholt and Sebastian Lüning (translated by the usual climate denial enablers here) manages to make several simple errors which we will detail here.
Whopping Wrong Temperature Change Claim
Solheim claims that "Hansen’s model overestimates the temperature by 1.9°C, which is a whopping 150% wrong." Yet Scenario A – the emissions scenario with the largest projected temperature change – only projects 0.7°C surface warming between 1988 and 2012. Even if emissions were higher than in Scenario A (which they weren’t, but Solheim wrongly claims they were), they would have to be several times higher for Hansen’s model to project the ~2.3°C warming over just 23 years (1°C per decade!) that Solheim claims. Solheim’s claim here is simply very wrong.
CO2 is Not the Only Greenhouse Gas
Quite similar to Patrick Michaels’ misrepresentation of Hansen’s study back in 1998, Solheim claims that Hansen’s Scenario A has been closest to reality by focusing exclusively on CO2 emissions. However, the main difference between the various Hansen emissions scenarios is not due to CO2, it’s due to other greenhouse gases (GHGs) like chlorofluorocarbons (CFCs) and methane (CH4), whose emissions have actually been below Scenario C (Figure 1). In fact, more than half of the Scenario A radiative forcing comes from non-CO2 GHGs.
Figure 1: Radiative forcing contributions from 1988 to 2010 from CO2 (dark blue), N2O (red), CH4 (green), CFC-11 (purple), and CFC-12 (light blue) in each of the scenarios modeled in Hansen et al. 1988, vs. observations (NOAA). Solheim claims the actual changes were larger than Scenario A (indicated by the blue arrow). In reality they were smaller than Scenario B.
Wrong on Temperature Data
Solheim also produces a very strange plot of what he claims is "the ultimate real-measured temperature (rolling 5-year average)." His plot shows the purported 5-year running average temperature around 1998 as hotter than at any later date to present, which is not true of any surface or lower atmosphere temperature data set. It appears that Solheim has actually plotted annual temperature data, or perhaps a 5-month running average, most likely from HadCRUT3, which has a known cool bias and has of course been replaced by HadCRUT4. There is simply no reason for Solheim to be using the outdated data from HadCRUT3.
Figure 2 shows what the comparison should look like when using the average of HadCRUT4, NASA GISS, and NOAA temperature data sets.
Figure 2: Hansen’s 1988 Scenario A (blue), B (green), and C (red) temperature projections compared to actual observed temperatures (black – average of NASA GISS, NOAA, and HadCRUT4) and to Solheim’s temperature plot (grey).
Ultimately, Solheim concluded "The sorry state of affairs is that these simulations are believed to be a true forecast by our politicians." However, even if global climate models from several decades ago didn’t have the remarkable record of accuracy that they do, present-day climate modeling is far more sophisticated than that done by Hansen et al. nearly a quarter century ago. Climate models are now run on some of the world’s fastest supercomputers, whereas Hansen’s was run on a computer with substantially less computing power than a modern day laptop. While climate model forecasts are imperfect (as are all forecasts), they have thus far been quite accurate and are constantly improving.
What Can We Learn From This?
The observed temperature change has been closest to Scenario C, but actual emissions have been closer to Scenario B. This tells us that Hansen’s model was "wrong" in that it was too sensitive to greenhouse gas changes. However, it was not wrong by 150%, as Solheim claims. Compared to the actual radiative forcing change, Hansen’s model over-projected the 1984-2011 surface warming by about 40%, meaning its sensitivity (4.2°C for doubled CO2) was about 40% too high.
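The sensitivity arithmetic in the paragraph above is just a proportional rescaling; a one-liner makes it explicit:

```python
# If projected warming was ~40% too high for the realized forcing,
# the model's sensitivity scales down by the same factor.
model_sensitivity = 4.2  # Hansen's 1988 model, deg C per doubled CO2
overestimate_factor = 1.4

implied_sensitivity = model_sensitivity / overestimate_factor
print(round(implied_sensitivity, 1))  # 3.0
```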
What this tells us is that real-world climate sensitivity is right around 3°C, which is also what all the other scientific evidence tells us. Of course, this is not a conclusion that climate denialists are willing to accept, or even allow for discussion. This willingness to unquestioningly accept something which is obviously simply wrong is a good test of the difference between skepticism and denial. Indeed, in misrepresenting Hansen’s results, Solheim has exhibited several of the 5 characteristics of scientific denialism.
Posted on: Monday, June 18, 2012 12:14 PM
Over the past few weeks, SkS articles about the "politics" of climate change have generated more comments than articles about the "science" of climate change. A case in point is Andy S’s Scientific literacy and polarization on climate change. In this article, Andy discusses the findings of the paper The polarizing impact of science literacy and numeracy on perceived climate change risks by Kahan et al., published online by Nature Climate Change on May 27, 2012.
Toon of the Week
Source: Stephanie McMillan, Code Green.
Quote of the Week
Source: Rio+20: Concrete Goals the Only Recipe for Success by Stephen Leahy, IPS News, June 16, 2012
Issue of the Week
What is your primary source of information about what’s transpiring at the ongoing Rio+20 summit?
Scientific Term of the Week
Aerosols: A collection of airborne solid or liquid particles, with a typical size between 0.01 and 10 μm, that reside in the atmosphere for at least several hours. Aerosols may be of either natural or anthropogenic origin. Aerosols may influence climate in several ways: directly through scattering and absorbing radiation, and indirectly by acting as cloud condensation nuclei or modifying the optical properties and lifetime of clouds (see Indirect aerosol effect).
Source: Annex I (Glossary) to Climate Change 2007: Working Group I: The Physical Science Basis, IPCC Fourth Assessment Report.
The Week in Review
A complete listing of the articles posted on SkS during the past week.
A list of articles that are in the SkS pipeline. Most of these articles, but not necessarily all, will be posted during the week.
SkS in the News
SkS Spotlights: The Alfred Wegener Institute
The Alfred Wegener Institute carries out research in the Arctic and Antarctic as well as in the high and mid latitude oceans. The institute coordinates German polar research and provides important infrastructure to national and international science, e.g. the research ice breaker “Polarstern” and research stations in the Arctic and Antarctic.
Posted on: Monday, June 18, 2012 1:33 AM
This is a reprint of a news release posted by the National Science Foundation (NSF) on May 21, 2012.
Researchers find that the global carbon pool in seagrass beds is as much as 19.9 billion metric tons
Dense seagrass meadows are a hallmark of the Florida Coastal Everglades LTER site.
Credit: Florida Coastal Everglades LTER Site
Posted on: Saturday, June 16, 2012 11:06 PM
Note: Jan-Erik Solheim has just recently made some very incorrect claims about Hansen 1988, which we will debunk later this week. Consider this post a brief primer.
Earlier this year in a post Patrick Michaels Continues to Distort Hansen 1988, Part 1, we compared Patrick Michaels’ claims about Hansen et al. (1988) in his 1998 testimony before US Congress to reality. As Figure 1 shows, we found that Michaels had distorted reality, telling Congress that Hansen’s Scenario A was closest to reality, when in fact the actual 1988 to 1998 radiative forcing changes weren’t even quite as large as in Scenario C.
Figure 1: Radiative forcing contributions from 1988 to 1998 from CO2 (dark blue), N2O (red), CH4 (green), CFC-11 (purple), and CFC-12 (light blue) in each of the scenarios modeled in Hansen et al. 1988, vs. observations (NOAA).
Michaels had claimed Scenario A was accurate because at one point Hansen described it as "business as usual" (BAU). However, between 1988 and 1998 some major events occurred, such as passage of the Montreal Protocol international agreement to reduce chlorofluorocarbon (CFC) emissions, and the collapse of the Soviet Union. Thus, while it is debatable whether Scenario A truly represents a BAU scenario (we argued that it would be more accurate to describe Scenario B as BAU – see Figure 2 below), we did not follow a BAU path over this timeframe anyway. But more importantly, in terms of the greenhouse gas (GHG) radiative forcing (which is what Hansen’s model responded to), Scenario C was the closest to reality as of 1998.
However, this post focused primarily on the radiative forcings as of 1998, and only briefly touched on the up-to-date radiative forcing data (Figure 2).
Figure 2: Radiative forcing changes (1988 to 2010) for the three emissions scenarios in Hansen et al. 1988 (dark blue [A], red [B], and green [C]) vs. Skeie et al. (2011) GHG-only (light blue) and all anthropogenic forcings (purple), and business as usual (BAU) GHG based on a rate of increase consistent with the Skeie et al. estimate for 1978 to 1988 (gray, dashed).
We recently received a request to update Figure 1 to essentially break out the light blue curve in Figure 2 for the various individual GHGs. This update is shown in Figure 3.
Figure 3: Radiative forcing contributions from 1988 to 2010 from CO2 (dark blue), N2O (red), CH4 (green), CFC-11 (purple), and CFC-12 (light blue) in each of the scenarios modeled in Hansen et al. 1988, vs. observations (NOAA).
As Figures 2 and 3 show, the net GHG forcing has fallen smack dab between Scenarios B and C. The CO2 and N2O increases have been closest to Scenario B, whereas the methane and CFC increases have been closest to Scenario C, though even somewhat lower. The Montreal Protocol has been a major success, as the CFC increases over the past 22 years have been almost zero. In fact, the 2010 atmospheric CFC-11 concentration was actually slightly below its 1988 level.
As noted above, this analysis only considers long-lived GHGs. According to Skeie et al. (2011), the radiative forcings associated with ozone (another GHG) and land use change have also increased. The direct aerosol cooling effect also decreased during the 1990s. Therefore, the net 1988-2010 radiative forcing has increased at a rate closest to Scenario B, but approximately 16% lower, as illustrated by the purple curve in Figure 2.
In short, claims that actual emissions have followed a Scenario A path are wrong, and usually based on rhetoric (i.e. ‘Hansen said Scenario A was BAU’ – Michaels’ argument) or an undue focus on CO2 (i.e. ‘CO2 emissions have accelerated, as expected in Scenario A’ – the Solheim argument we will see later this week). In fact, CO2 concentrations and forcings don’t start to differ significantly between Scenarios A and B until after 2020. The main difference between the various scenarios, as illustrated in Figure 3, is in CFCs and methane. The real-world emissions of these GHGs have been quite low – even lower than in Scenario C.
Overall, in order to evaluate which scenario has been closest to reality, we need to evaluate all radiative forcings. In terms of GHGs only, the result has fallen between Scenarios B and C. In terms of all radiative forcings, the result has fallen closest to, and about 16% below, Scenario B. Scenario A is the furthest from reality, which is a very fortunate result.
Posted on: Saturday, June 16, 2012 4:52 AM
[crosspost from ClimateBites]
A conservative specialist in environmental law—Professor Jonathan Adler of Case Western Reserve University—lays out a thoughtful conservative approach to tackling climate change in a recent post at The Atlantic magazine.
Climate hawk David Roberts (Grist) accurately describes Adler’s piece as “an eloquent, principled case for the simple notion that ‘embrace of limited government principles need not entail the denial of environmental claims.’”
Adler suggests four policy changes to “make it cheaper and easier to adopt low-carbon technologies:” 1) prizes to spark innovation, 2) lower legal barriers to deployment, 3) a revenue-neutral carbon tax, and 4) adaptation.
Roberts notes, and most scientists would agree, that Adler understates the scale and urgency of the problem, cause and solutions. And no doubt, Adler—like Peter Wehner, Bob Inglis and a few others—is an outlier among today’s conservative leaders, for whom denying climate change has become a litmus test.
But Prof. Adler is clearly making, as he has for years, a serious attempt to grapple with the climate reality without abandoning conservative principles. Is there anything more important in climate politics today?
Adler’s short Atlantic article is worth reading in its entirety, as are some of his links below, for clues on how to speak effectively about climate to conservatives. Here’s the gist of his argument:
First, he makes the case, for skeptics, that global warming is real (the links are Adler’s; bold emphasis is mine):
Then Adler pivots to an interesting moral/legal case for climate action based on property rights.
Finally, Adler proposes four solutions, which, though no doubt insufficient, are creative and serious. Most interesting is his case for a carbon tax à la Hansen.
Interesting, no? Isn’t this the debate—how to solve the problem in a manner compatible with one’s values—that responsible leaders should be having? Adler’s piece is a good starting point for such a discussion—and offers at least a glimmer of hope for dialogue instead of a shouting match.
Posted on: Tuesday, June 12, 2012 6:04 PM
One important thing in science is method development. Science works at the edge between the known and the unknown, and in order to reveal a little bit more of the unknown, it is quite often necessary to improve our research methods and even to come up with some new ones. That is because the studied issues, or at least some aspects of them, have not been known for long, and research methods developed originally to study something else might not be suitable for studying the new issue.
We have some studies this week that are at least partially method development papers. There is a paper about a meeting of statisticians, mathematicians, and climate scientists, where they discussed how uncertainties should be quantified in climate observations. One paper makes an effort to determine surface air temperatures using satellite measurements. Ice core synchronisation is the subject of one paper. Speaking of ice cores, there’s another paper on ice cores which is borderline method development. Ice cores are used to study past climates, but they have a limited reach back in time. Currently, the longest ice core reaches back 800,000 years. Now researchers have studied ice flows in the Allan Hills icefield and found that old ice there has moved upwards, so old ice is present at the surface, presenting the possibility of extending ice core records beyond 800,000 years.
Other studies this week touch on the unknowns of the carbon cycle, temperatures in the European Alps, atmospheric carbon dioxide effects, Greenland glaciers, Southern Ocean wind, climate change scepticism, tropical and African rainfall, and atmospheric methane.
Posted on: Monday, June 11, 2012 11:41 PM
A little over a year ago we reported on the success of the Regional Greenhouse Gas Initiative (RGGI), a carbon cap and trade system implemented by ten northeastern states in the USA (Connecticut, Delaware, Maine, Maryland, Massachusetts, New Hampshire, New Jersey, New York, Rhode Island, and Vermont; New Jersey has since dropped out), which set the goal of reducing power-sector carbon dioxide (CO2) emissions by 10% by 2018.
Through the first two years of the system, the ten states had generated $789 million through the auctioning and direct sale of CO2 emissions allowances. Each state developed its own plan for investing those funds, but overall, 52% was used for energy efficiency programs, 14% for energy bill payment assistance, including assistance to low-income ratepayers, and 11% to accelerate deployment of renewable energy technologies. New York, New Hampshire, and New Jersey also diverted some of the funds to reduce their state budget deficits.
A year later, we have another RGGI update. The states have far exceeded their emissions reduction target, with a 23% overall reduction in 2009-2011 power plant CO2 emissions as compared to the 2006-2008 average, already achieving more than twice the emissions reduction goal, six years ahead of schedule. Low natural gas prices have helped the power plants transition away from coal combustion, thus helping them surpass the RGGI targets.
But at What Cost?
However, when I attended Christopher Monckton’s presentation to California policymakers in my hometown, I was assured that carbon cap and trade systems would absolutely cripple the economy. Monckton told the audience that California’s cap and trade system (which, to be fair, is more ambitious than RGGI) would cost the state hundreds of billions of dollars and drive businesses and jobs out of California.
The Koch-funded Americans for Prosperity similarly claimed that RGGI would lead to "higher taxes, lost jobs, and less freedom" in addition to a doubling of electric rates.
This is in fact the standard contrarian argument against carbon pricing systems – that they will cripple the economy, drive up electric rates, and scare businesses away. If this argument is true, then more than three years after RGGI implementation, surely we should have seen these effects in play, with plummeting gross state product (GSP – the state equivalent of gross domestic product) and skyrocketing unemployment and electricity prices in the RGGI states. So, how does the contrarian argument stack up against reality?
RGGI State Economies Fare Better than the National Average
One tricky aspect in evaluating the economic impact of the RGGI system is that it was implemented right at the start of the current major economic recession, in 2008-2009. Thus GSP has fallen and unemployment has indeed risen in the RGGI states, but it has also risen all across the United States. The easiest way to try and take the effects of the economic recession into account is to compare the average GSP and unemployment changes in the RGGI states vs. the average changes nationwide.
Table 1 examines unemployment data, Table 2 examines GSP, and Table 3 examines electricity rates for the RGGI states vs. the national average.
Table 1: January 2008 and December 2011 unemployment statistics. Data from the U.S. Bureau of Labor Statistics.
Table 2: Average percent annual GSP growth from 1997 to 2007 (pre-RGGI and recession) and 2008-2011 (post-RGGI and recession). Data from the U.S. Department of Commerce Bureau of Economic Analysis, available for plotting at Google Public Data.
Table 3: Electricity rates (total price), average for 2005-2007 (pre RGGI) and 2008-2011 (post-RGGI). Data from the U.S. Energy Information Administration.
As Tables 1 and 2 show, the RGGI states on average have weathered and begun recovering from the economic recession better and faster than the national average in terms of both GSP and unemployment. In fact, the only RGGI state to fare worse than the national average in terms of unemployment is Rhode Island, and only Rhode Island, Maine, and Maryland have experienced larger GSP declines than the national average over the past three years. 67% of RGGI states have beaten the national average in terms of GSP, and 89% have done better in terms of employment.
Table 3 shows that while there is a fairly wide variation between the various RGGI states, on average their electricity rates have not risen faster than the national average over the past three years. In fact, the rates in only three of the nine RGGI states rose faster than the national average over this period.
There has certainly been no sign of the predicted plummeting GSP, skyrocketing electricity prices, or businesses and jobs fleeing the states participating in this carbon pricing system. This real-world example shows that claims that carbon pricing systems will cripple the economy are unfounded alarmism. In reality, they are an economically effective way of reducing greenhouse gas emissions.
Posted on: Wednesday, June 06, 2012 12:02 AM
We often hear claims from climate contrarians that climate scientists are guilty of what they describe as "pal review." The conspiracy theory goes something like this – climate scientists conduct biased research with the goal of confirming the human-caused global warming theory. They then submit their biased results to a peer-reviewed journal with friendly editors ("pals") who pass their paper along to friendly reviewers (other "pals") who give their fraudulent work the green light for publication. Thus, the contrarians argue, the preponderance of peer-reviewed literature supporting human-caused global warming is really just a sign of corruption amongst climate scientists.
However, while climate contrarians are never able to produce any evidence to support their conspiracy theory, John Mashey has thoroughly documented a real world example of true pal review. Contrary to the standard conspiracy theory, the pal review did not involve mainstream climate scientists, but instead the climate contrarians themselves.
The True Story of Climate Research Pal Review
Mashey has done an excellent job documenting a real life case of pal review, which happened at the journal Climate Research between 1997 and 2003. That particular journal was once again brought to the forefront in the recent second Climategate stolen email release.
In those emails, various climate scientists had expressed concern that Climate Research was publishing shoddy papers by a small group of climate contrarians, and discussed what they could do about it. The most infamous of these papers was one by Soon and Baliunas (2003), which concluded that current global temperatures are not anomalous compared to the past 1,000 years. After publishing this paper, Soon was invited by Senator James Inhofe to testify before the US Congress, and the Soon and Baliunas paper was used by Congressional Republicans to justify opposition to climate legislation.
However, the paper contained numerous major fundamental flaws, such as equating dryness with hotness, and was subsequently roundly refuted by an article in the American Geophysical Union journal Eos written by a number of prominent climate scientists. This paper, and Climate Research's refusal to revise or retract it, led to the resignation of five of the journal's editors, including recently-appointed editor-in-chief Hans von Storch, who publicly explained his reasons for resigning.
In short, the journal’s chief editor voiced the exact same concerns as the climate scientists in the Climategate 2 emails – that certain Climate Research editors were systematically publishing methodologically flawed papers in their journal. Soon and Baliunas were far from the only climate contrarians to benefit from the journal’s friendly editorial policy. In fact, the biggest pal review beneficiary bears a very familiar name.
Patrick Michaels and Pals
Prior to Hans von Storch’s promotion to Climate Research editor-in-chief in 2003, the journal did not have a chief editor, and so authors sent their manuscripts to an Associate Editor of their choice. One particular Associate Editor, Chris de Freitas, accepted 14 separate papers from a select group of climate contrarians during the six-year period from 1997 to 2003.
As Mashey shows, from 1990 to 1996, Climate Research published zero papers from this group. From 1997 to 2003, the journal published 17 papers from this group, 14 with de Freitas as the Associate Editor. Serial data deleter Patrick Michaels was an author on 7 of the 14 pal reviewed papers, which also accounted for half of his total peer-reviewed publications during this timeframe. During this period, 14 of the 24 (58%) papers accepted by de Freitas came from this group of contrarians. After von Storch’s resignation in 2003, de Freitas published 3 more papers from authors outside this group before leaving the journal in 2006.
Robert Davis, another of the ‘pals’, was also an Associate Editor at Climate Research; he accepted 36 papers during his tenure, two of which were co-authored by fellow pal Robert Balling. The journal also published 5 other papers from this group under non-pal editors. In total, at least 16 of the 21 (76%) papers published by Climate Research and authored by this group of climate contrarians had pal review editors, mostly de Freitas (67%), during this six-year window.
After von Storch’s resignation, Mashey documents, the pals’ Climate Research publications dried up. Davis accepted one of Balling’s papers submitted in 2004, and papers co-authored by Balling and by de Freitas were published by the journal in 2008 (Table 1). 18 of the pals’ 21 (86%) Climate Research publications were submitted in the 1997 to 2003 timeframe.
Table 1: Climate Research publications grouped by Associate Editor. Grey bars show approximate editor tenure as derived from received dates of papers. The "pals" papers are shown in red capitals, 14 accepted by de Freitas (bold), and 7 handled by others (red, underlined italics). De Freitas also accepted 13 seemingly normal papers from other authors (lowercase black).
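The percentages quoted above can be checked with a few lines of arithmetic. The counts below are taken directly from the text; this is a simplification of Mashey's full breakdown.

```python
# Tallies quoted in the text above (simplified from Mashey's analysis).
de_freitas_pal = 14        # pal papers accepted by de Freitas, 1997-2003
de_freitas_total = 24      # all papers de Freitas accepted in that window
pal_total = 21             # all pal papers published by Climate Research
pal_by_pal_editors = 16    # pal papers handled by pal editors (de Freitas + Davis)
pal_1997_2003 = 18         # pal papers submitted 1997-2003

def pct(part, whole):
    """Rounded percentage, as quoted in the article."""
    return round(100 * part / whole)

print(pct(de_freitas_pal, de_freitas_total))  # share of de Freitas acceptances from pals
print(pct(pal_by_pal_editors, pal_total))     # share of pal papers with pal editors
print(pct(de_freitas_pal, pal_total))         # share of pal papers handled by de Freitas
print(pct(pal_1997_2003, pal_total))          # share submitted in the 1997-2003 window
```

This reproduces the 58%, 76%, 67%, and 86% figures cited in the surrounding paragraphs.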
Mashey also finds that the 15 ‘pals’ were closely connected in climate contrarian activities outside of Climate Research as well, for example working for various anti-climate think tanks, most being connected with either Fred Singer or Patrick Michaels.
There is also substantial overlap with the pals joining together to author these papers (Figure 1).
Figure 1: Overlap between pal authors of the 14 de Freitas Climate Research pal review publications between 1997 and 2003. The node numbering represents the Climate Research volume and page number of the pal publications, while the node connections represent papers written by the same pal authors (i.e. 9.3p14 and 23.1p15 were both authored by Michaels and Knappenberger). Image by jg and Kevin C.
The Purpose of the Mainstream Pal Review Myth
For those who oppose the prudent path forward with regards to climate change, which involves major global greenhouse gas emissions reductions, the scientific consensus on human-caused global warming is a very inconvenient thing. Despite the public relations damage resulting from Climategate, people still trust climate scientists’ opinions about climate science (although political conservatives’ trust in scientists in general has declined). However, much of the public (at least the American public) doesn’t realize that there is a scientific consensus on human-caused climate change. Polls in October 2010 and September 2011 found that 44% and 37% of the American public, respectively, believe that scientists are divided regarding the cause of global warming.
According to the March 2012 George Mason Center for Climate Change Communication (CCCC) national poll, climate scientists are the most trusted source for climate science information, with 74% of public trust (Figure 2). However, a large segment of the population believes there is a major scientific debate on the subject, no doubt thanks to the false media balance which gives the ~3% minority of experts who think humans aren’t the dominant cause of the current climate change (and their non-expert surrogates) ~50% of the media attention. Therefore, many people don’t believe that humans are the primary cause of global warming (approximately 41% of Americans).
Figure 2: Responses to the George Mason CCCC poll question "How much do you trust or distrust the following as a source of information about global warming?"
The numbers reveal a stark picture: 74% of Americans trust climate scientists, yet 41% think scientists are divided on the cause of the warming, and 41% think the observed warming is mostly natural.
Thus, as Ding et al. (2011) concluded, if a larger percentage of people realized that there is a scientific consensus on the issue amongst the group they trust most on the subject (and rightly so), more people would believe that humans are causing global warming, and more people would demand that we do something about it. The lack of public awareness of the scientific consensus on human-caused climate change is one of the biggest obstacles to taking climate mitigation action.
For this reason, climate contrarians have attacked the scientific consensus from many different angles. Some have tried to attack the credibility of the many different surveys and studies documenting the consensus. Others simply ignore this documentation and deny the consensus exists at all.
The third group, discussed in this post, attacks the credibility of the consensus itself, claiming it’s all part of a massive fraudulent conspiracy of thousands of corrupt climate scientists (note that conspiracy theories are one of the five characteristics of scientific denialism). Ironically, this conspiracy theory has been most recently voiced by pal review beneficiary Patrick Michaels.
Michaels of course provides no evidence whatsoever to support this conspiracy theory of peer-review corruption. He expects us to swallow his tale of "pal review" – the conspiracy theory that thousands of climate scientists are publishing thousands of biased papers every year in order to keep the human-caused global warming theory propped up – based on nothing more than his say-so.
While Michaels is indeed something of an expert on the subject, his expertise comes from himself being one of the individuals most guilty of engaging in climate research pal review.
Pal Review Summary
While Patrick Michaels has accused mainstream climate scientists of a vast conspiracy involving pal review (and exposed his own characteristic of scientific denialism in the process) without any substantiation or supporting evidence, in reality Patrick Michaels himself was the biggest beneficiary in the one actual demonstrated case of climate science pal review, as documented by Mashey.
A group of 14 climate contrarians found a sympathetic journal editor who proceeded to publish a large number of papers from this group over a very short timeframe, many of which were scientifically flawed, some of which were subsequently used by politicians to oppose climate legislation.
Ironically, the climate scientists who tried to do something about this problem have themselves been accused of trying to "hijack" or "subvert" the peer-review process. And of course the guiltiest party of all, Patrick Michaels, has accused thousands of climate scientists of the very sort of pal review he himself engaged in.
Our tale is one of irony, hypocrisy, and projection. The next time you see a complaint about the fairy tale of rampant climate science "pal review", direct the accuser to John Mashey’s documentation of a pal review true story.
Note: this post has been used as the Intermediate rebuttal to the myth Climate science peer review is pal review.