
BEST OF 2013, POLEX: “Economist” response, Verchot and Angelsen

CIFOR scientists question recent Economist article on global air temperature
Satellite Image. Digital Globe


Editor’s Note: CIFOR’s wide-ranging research in 2013 made it a busy year for Forests News. From Brazil nuts to board games; from traditional knowledge helping Africa to smoke choking Southeast Asia; from photogenic leopards to edible insects, CIFOR brought new data, insight and analysis to a host of forest-related issues across the landscape. (Did we mention “landscapes”?)

When The Economist magazine asserted that the air temperature of the planet has been flat for the past 15 years despite a steady increase in the greenhouse gas concentration of the atmosphere, CIFOR’s Lou Verchot and Arild Angelsen, experts in forests and climate change, swiftly disagreed. Rebutting both the claim that global temperature has stabilized and the policy conclusions drawn from it, Verchot and Angelsen disputed the notion that humanity could simply “adapt” to climate change, writing in April that focusing on one parameter of climate change, as the Economist article did, is a poor basis for making policy.

As the year draws to a close, we are revisiting the most popular blog articles authored by CIFOR scientists in 2013.

BOGOR, Indonesia (16 April, 2013) – A recent article in The Economist magazine asserted that the air temperature of the planet has been flat for the past 15 years despite a steady increase in the greenhouse gas concentration of the atmosphere.

The authors found the lack of warming surprising, given the large volume of greenhouse gases pumped into the atmosphere between 2000 and 2010.

They then provided a lucid explanation of climate sensitivity and suggested that the apparent pause in temperature increase may mean that a doubling of atmospheric CO2 concentration could lead to lower-than-expected temperatures, and that, as a result, social and environmental policy should perhaps focus on adaptation rather than mitigation.

We question both the claim that global temperature has stabilized and the policy conclusions drawn.

Scientists always struggle with inflection points in trends. It is difficult to predict when a dramatic change in a trend will occur; normally, one can only be identified long after the fact.

Society asks scientists to make reasonable guesses about the future, whether it is in markets, disease spread, or rates of deforestation.

When the past is no longer a good predictor of the future, our ability to do this is weak in all fields. However, we still need to make economic policy, health policy and environmental policy. We have to accept that no matter how good our science is, the information about the future will always be imperfect, and that we have to act based on imperfect information.

Is the global temperature increase slowing down?

There is no single, simple measure of global temperature, but two well-respected datasets are HADCRUT4 and NASA’s GISS.

Interested readers should have a closer look at these datasets. They take different approaches to filling in spatial gaps in the temperature record, but they both underscore the same point: one needs to look at the long-term trend to understand changes in global temperature, not just the last 30 years.

The figure below presents the HADCRUT4 dataset used in The Economist article to draw their policy conclusion.

Figure 1: The HADCRUT4 dataset as it appeared in The Economist.

Temperatures were relatively steady between 1850 and 1900 (black line). There was a reasonably robust increase in temperature between 1900 and 1940 (red line), a 30-year hiatus (even a slight negative trend) between 1940 and 1970 (green line), followed by more than 30 years of very steep increases in temperature (blue line).

Some argue that there was an inflection point in the 30-year trend around 2006. Others suggest it happened earlier, because of the very high temperature anomaly measured in 1998 (the year of the strongest El Niño of the century). One could argue either way, but it is not clear that we can detect a slowing signal against the background noise.
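To see why this matters, consider a back-of-the-envelope simulation. The Python sketch below is purely illustrative: it assumes a steady underlying warming of 0.017°C per year (roughly the observed post-1970 rate) plus about 0.1°C of interannual noise, and compares a least-squares trend fitted to the full record with one fitted to just the last 15 years.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic annual anomalies, 1970-2012: an assumed steady warming of
# 0.017 degC/yr plus ~0.1 degC of year-to-year noise (comparable to
# ENSO-driven variability). Illustrative numbers, not real data.
years = np.arange(1970, 2013)
anomalies = 0.017 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

def trend_with_error(x, y):
    """Ordinary least-squares slope and its standard error."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    se = np.sqrt((resid @ resid) / (x.size - 2) / np.sum((x - x.mean()) ** 2))
    return slope, se

full_slope, full_se = trend_with_error(years, anomalies)
recent_slope, recent_se = trend_with_error(years[-15:], anomalies[-15:])

print(f"full record  : {full_slope:+.4f} +/- {full_se:.4f} degC/yr")
print(f"last 15 years: {recent_slope:+.4f} +/- {recent_se:.4f} degC/yr")
```

Run this with different random seeds and the 15-year slope swings widely, sometimes to near zero, while the full-record trend barely moves. A window that short simply cannot distinguish a genuine change in trend from ordinary year-to-year variability.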

For us, the take-home message is that 8 of the last 10 years rank among the warmest on record. The 2000s were much warmer than the 1990s, and even since 1970 there have been periods of faster and slower growth in surface air temperatures.

We had a period in the 1990s when it looked like the best scientific estimate of climate sensitivity to CO2 was below the actual rate of temperature increase; we have a period in the 2010s when it looks like that estimate is too high.

So while the temperature curves drawn by the Intergovernmental Panel on Climate Change (IPCC), a scientific body that provides comprehensive scientific assessments of the risk of climate change, tend to be smoother than reality, we still do not fully know the future and we cannot predict the next temperature turning point.

However, the probability of an increase is much greater than that of a downturn. Our best predictions about the future come from the climate models. A recent study in Nature Geoscience suggests that the early models have done a good job in predicting the recent temperature change.

The perils of focusing on one parameter of climate change

Science continuously brings new knowledge, which helps to reduce the uncertainty of predictions. Some new findings paint a grimmer picture of our future climate, while others do the opposite. Nothing would make us happier than if the worst-case, or even the medium-case, IPCC prediction were to become less likely and not come to pass.

But throwing out long-term trends on the basis of a few observations, and drawing misleading interpretations from them, is dangerous. The Economist article makes two principal errors in the analysis behind its policy-relevant conclusion.

First, the analysis is limited to one type of data – surface air temperature. An article by the American Geophysical Union noted: “Many researchers and professionals work with data from outside their core field of training and lack the knowledge, time, and motivation to thoroughly survey and synthesize the literature and documentation on unfamiliar data sets.”

Focusing on one parameter of climate change, as the article did, is inappropriate for making policy recommendations.

The heat content of the oceans continues to increase, oceanic circulation patterns continue to change, glaciers continue to melt, etc. In fact, the majority of the excess heat in the earth system can be found in the oceans, not the atmosphere. Many of the more damaging aspects of climate change are not captured in measures of surface air temperature.

Second, the article wrongly focuses on the effects of doubling atmospheric CO2. While this is a useful measure of climate sensitivity, there is nothing in the emission trend that suggests that our emissions will stop when atmospheric concentrations reach this level.

There are several scenarios on the table in which we hit 2.5 times the preindustrial CO2 concentration by the end of the century. So avoiding emissions reductions in favor of simply adapting to changes is not really an option, and it is unfortunate that the article suggests it is.
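To make the arithmetic concrete: under the standard approximation that equilibrium warming scales with the logarithm of the CO2 concentration ratio, reaching 2.5 times the preindustrial concentration yields about a third more warming than a doubling, whatever the true sensitivity turns out to be. The sketch below assumes an illustrative sensitivity of 3°C per doubling, purely for the sake of the example.

```python
import math

def equilibrium_warming(conc_ratio, sensitivity_per_doubling=3.0):
    """Equilibrium warming (degC) for a given CO2 concentration ratio,
    using the standard logarithmic-forcing approximation:
        dT = S * log2(C / C0).
    The default S of 3.0 degC per doubling is an illustrative assumption."""
    return sensitivity_per_doubling * math.log2(conc_ratio)

print(f"2.0x preindustrial: {equilibrium_warming(2.0):.1f} degC")  # 3.0 by construction
print(f"2.5x preindustrial: {equilibrium_warming(2.5):.1f} degC")  # ~4.0
```

In other words, even a lower-than-expected sensitivity does not cap the warming; the concentration we ultimately reach matters just as much.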

So what should scientists be telling climate policy makers?

It is clear that there is uncertainty in the temperature sensitivity of the climate system to greenhouse gases; science has never hidden that fact. Furthermore, we know that the environmental impact of CO2 goes beyond climate: ocean acidification from rising CO2 levels can cause serious problems for marine ecosystems.

We do not think the proposed inflection provides a basis for policy just yet, and it should not be used as an argument for doing nothing. Scientists still need to tell policy makers that while we should welcome any slowdown, as it gives us some space to put in place measures to limit future damage, we still must put those measures in place.

Current global efforts to curb climate change are lagging seriously behind the schedule needed to get on a 2-degree path. Even if the next one or two decades do demonstrate that global warming is slower than we anticipated, that would only serve to reduce the gap between political reality and climate necessity. The risk of overshooting in current climate mitigation efforts is minuscule.

The saying goes that “an ounce of prevention is worth a pound of cure.” Legal systems embody the principle of “duty of care,” which establishes responsibility when negligent behavior causes damage to society or to third parties.

In international policy we have the “precautionary principle”, which says that when activities raise threats to human health or the environment, mitigation action is warranted even in the absence of certainty of all of the cause-and-effect relationships. These ideas are at the heart of many of the policy positions of developing countries and of the international discourse on climate justice.

The science is settled enough to warrant action to mitigate climate change. Human activity is changing the climate; this is causing harm, and the harm will increase if we do nothing.

We have principles embodied in common sense, national law, and international agreements. Given the potential for damage to the climate system and for ensuing negative impacts on people, the conclusion of The Economist article appears reckless.

The world should apply the precautionary principle and limit emissions.
