Making Sense of Sensitivity … and Keeping It in Perspective

Climate ‘skeptics’ downplay the sensitivity of Earth’s climate to increased CO2 emissions and concentrations, and some policy makers may too. In the end, it’s the emissions and concentrations that most matter, rather than uncertainties about climate sensitivity.

Climate sensitivity is suddenly a hot topic.

Some commenters skeptical of the severity of projected climate change have recently seized on two sources to argue that the climate may be less sensitive than many scientists say and the impacts of climate change therefore less serious: A yet-to-be-published study from Norwegian researchers, and remarks by James Annan, a climate scientist with the Japan Agency for Marine-Earth Science and Technology (JAMSTEC).

While the skeptics making these points significantly overstate their case, a look at recent developments in estimates of climate sensitivity may help provide a better estimate of future warming. These estimates are critical, as climate sensitivity will be one of the main factors determining how much warming the world experiences during the 21st century.

Climate sensitivity is an important and often poorly understood concept. Put simply, it is usually defined as the amount of global surface warming that will occur in response to a doubling of atmospheric CO2 concentrations. These estimates have proven remarkably stable over time, generally falling in the range of 1.5 to 4.5 degrees C per doubling of CO2.* Using its established terminology, IPCC in its Fourth Assessment Report slightly narrowed this range, arguing that climate sensitivity was “likely” between 2 C and 4.5 C, and that it was “very likely” more than 1.5 C.
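That definition can be made concrete with the standard logarithmic forcing approximation, under which each doubling of CO2 contributes roughly equal warming. A minimal sketch (the logarithmic relation is textbook radiative physics rather than something spelled out in this piece, so treat it as an assumption of the sketch):

```python
import math

def equilibrium_warming(sensitivity_c, co2_ratio):
    """Equilibrium warming (deg C) for a given ratio of CO2
    concentration to its initial value, using the standard
    logarithmic approximation: warming scales with the number
    of doublings."""
    return sensitivity_c * math.log2(co2_ratio)

# By definition, one doubling yields the sensitivity itself:
print(equilibrium_warming(3.0, 2.0))  # a 3 C sensitivity gives 3 C
```

Note that this is equilibrium warming; the full response plays out over decades to centuries as the oceans slowly warm.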

The wide range of estimates of climate sensitivity is attributable to uncertainties about the magnitude of climate feedbacks (e.g., water vapor, clouds, and albedo). Those estimates also reflect uncertainties involving changes in temperature and forcing in the distant past. But based on the radiative properties of CO2, there is broad agreement that, all else being equal, a doubling of CO2 will yield a temperature increase of a bit more than 1 C if feedbacks are ignored. However, it is known from estimates of past climate changes and from atmospheric physics-based models that Earth’s climate is more sensitive than that. A prime example: the vast ice ages triggered by small perturbations in orbital forcing could not have occurred without strong feedbacks.

Water Vapor: Major GHG and Major Feedback

Water vapor is responsible for the major feedback, increasing sensitivity from 1 C to somewhere between 2 and 4.5 C. Water vapor is itself a powerful greenhouse gas, and the amount of water vapor in the atmosphere is in part determined by the temperature of the air. As the world warms, the absolute amount of water vapor in the atmosphere will increase and therefore so too will the greenhouse effect.
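The amplification described here follows the textbook feedback relation S = S0 / (1 − f), where f is the net feedback fraction. A rough sketch (the no-feedback value and the feedback fractions below are illustrative assumptions, not figures from this piece):

```python
def amplified_sensitivity(no_feedback_c, feedback_fraction):
    """Classic feedback amplification: the no-feedback warming is
    boosted by 1 / (1 - f), where f (< 1) is the fraction of the
    warming fed back as additional forcing."""
    return no_feedback_c / (1.0 - feedback_fraction)

no_feedback = 1.1  # "a bit more than 1 C" per doubling, feedbacks ignored
for f in (0.45, 0.6, 0.75):  # illustrative net feedback fractions
    print(f"f = {f}: sensitivity ~ {amplified_sensitivity(no_feedback, f):.1f} C")
```

Equal steps in f produce ever-larger steps in sensitivity, which is one reason published sensitivity distributions have well-constrained lower bounds but long upper tails.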

That increased atmospheric water vapor will also affect cloud cover, though the impacts of changes in cloud cover on climate sensitivity are much more uncertain. What is clear is that a warming world will also be a world with less ice and snow cover. As ice and snow melt, less of the Sun’s energy is reflected back to space, decreasing Earth’s albedo, with a predictable impact: more warming.

There are several different ways to estimate climate sensitivity:

  • Examining Earth’s temperature response during the last millennium, glacial periods in the past, or periods even further back in geological time, such as the Paleocene Eocene Thermal Maximum;
  • Looking at recent temperature measurements and data from satellites;
  • Examining the response of Earth’s climate to major volcanic eruptions; and
  • Using global climate models to test the response of a doubling of CO2 concentrations.

These methods produce generally comparable results, as shown in the figure below.


Figure from Knutti and Hegerl 2008.

The grey area shows IPCC’s estimated sensitivity range of 2 C to 4.5 C. Different approaches tend to obtain slightly different mean estimates. Those based on instrumental temperature records (e.g., thermometer measurements over the past 150 years or so) have a mean sensitivity of around 2.5 C, while climate models average closer to 3.5 C.

The ‘Sting’ of the Long Tail of Sensitivity

Much of the recent discussion of climate sensitivity in online forums and in peer-reviewed literature focuses on two areas: cutting off the so-called “long tail” of low-probability/high climate sensitivities (e.g., above 6 C or so), and reconciling the recent slowdown in observed surface warming with predictions from global climate models.

Being able to rule out low-probability/high-sensitivity outcomes is important for a number of reasons. For one, the non-linear relationship between warming and economic harm means that the most extreme damages would occur in very high-sensitivity cases (as Harvard economist Marty Weitzman puts it, “the sting is in the long tail” of climate sensitivity). Better constraining those high-sensitivity tails can therefore change assessments of the potential economic damages resulting from climate change. Much of the recent work arguing against very high-sensitivity estimates has been done by James Annan and Jules Hargreaves.
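Weitzman’s point can be illustrated numerically. Everything in the sketch below is assumed for illustration (a lognormal sensitivity distribution with a median of 3 C, and damages growing with the square of warming — neither comes from any cited study); the point is only that, with convex damages, a small-probability upper tail carries an outsized share of expected damage:

```python
import math

def lognormal_pdf(s, mu, sigma):
    """Density of a lognormal distribution -- right-skewed, like
    many published climate-sensitivity PDFs."""
    if s <= 0:
        return 0.0
    return math.exp(-(math.log(s) - mu) ** 2 / (2 * sigma ** 2)) / (
        s * sigma * math.sqrt(2 * math.pi))

def expected_damage(mu, sigma, cutoff, dx=0.01):
    """Numerically integrate damage(S) * pdf(S) for S up to `cutoff`,
    with damage assumed to grow as S squared (convex)."""
    total, s = 0.0, dx
    while s < cutoff:
        total += (s ** 2) * lognormal_pdf(s, mu, sigma) * dx
        s += dx
    return total

mu, sigma = math.log(3.0), 0.4   # assumed: median 3 C, long upper tail
full = expected_damage(mu, sigma, cutoff=20.0)
trimmed = expected_damage(mu, sigma, cutoff=6.0)
print(f"share of expected damage from S > 6 C: {1 - trimmed / full:.0%}")
```

With these assumed numbers, sensitivities above 6 C carry only a few percent of the probability yet a much larger share of the expected damage — which is why trimming the tail matters more, economically, than small shifts in the central estimate.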

The relatively slow rate of warming over the past decade has lowered some estimates of climate sensitivity based on surface temperature records. While temperatures have remained within the envelope of estimates from climate models, they have at times approached the edge of the 5 percent to 95 percent confidence interval, as shown in the figure below.


Figure from Ed Hawkins at the University of Reading (UK).

However, reasonably comprehensive global temperature records exist only since around 1850, and sensitivity estimates derived from surface temperature records can be overly sensitive to decadal variability. To illustrate the latter point, in the Norwegian study referred to earlier, an estimate of sensitivity using temperature data up to the year 2000 resulted in a relatively high sensitivity of 3.9 C per doubling. Adding in just a single decade of data, from 2000 to 2010, significantly reduces the estimate of sensitivity to 1.9 C.

There’s an important lesson there: The fact that the results are so sensitive to relatively short periods of time should provide a cautionary tale against taking single numbers at face value. If the current decade turns out to be hotter than the first decade of this century, some sensitivity estimates based on surface temperature records may end up being much higher.

So what about climate sensitivity? We are left going back to the IPCC synthesis: that it is “likely” between 2 C and 4.5 C per doubling of CO2 concentrations, and “very likely” more than 1.5 C. While different researchers have different best estimates (James Annan, for example, says his best estimate is 2.5 C), remaining uncertainties mean the range cannot yet be narrowed much further.

Ultimately, from the perspective of policy makers and the general public, the impacts of climate change and the required mitigation and adaptation efforts are largely the same in a world where carbon dioxide emissions are rising quickly, whether sensitivity is 2 or 4 C per doubling of CO2 concentrations.

Just how warm the world will be in 2100 depends more on how much carbon is emitted into the atmosphere, and what might be done about it, than on what the precise climate sensitivity ends up being. A world with a relatively low climate sensitivity — say in the range of 2 C — but with high emissions and with atmospheric concentrations three to four times those of pre-industrial levels is still probably a far different planet than the one we humans have become accustomed to. And it’s likely not one we would find nearly so hospitable.
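To put rough numbers on that, using the standard logarithmic approximation that warming scales with the number of CO2 doublings (a back-of-the-envelope assumption of this sketch, not a calculation from the text):

```python
import math

sensitivity = 2.0  # a low-end estimate, deg C per doubling of CO2
for ratio in (3.0, 4.0):  # concentration relative to pre-industrial
    warming = sensitivity * math.log2(ratio)
    print(f"{ratio:.0f}x pre-industrial CO2 -> ~{warming:.1f} C of warming")
```

Even with a sensitivity at the low end of the range, tripling to quadrupling concentrations implies roughly 3 to 4 C of eventual warming — comparable to a single doubling under a high sensitivity.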

Editor’s Note: This piece uses Celsius rather than Fahrenheit temperature units. The C units in this feature translate to the following F units: 1.5C=2.7F; 4.5C=8.1F; 2C=3.6F; 3.5C=6.3F; 6C=10.8F; and 3.9C=7.02F.
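For readers checking those conversions: temperature *differences* convert by the 9/5 factor alone; the familiar +32 offset applies only to absolute temperatures. A quick sketch:

```python
def delta_c_to_f(delta_c):
    """Convert a temperature *difference* (not an absolute
    temperature) from Celsius to Fahrenheit: scale by 9/5,
    with no +32 offset."""
    return delta_c * 9.0 / 5.0

for dc in (1.5, 2.0, 3.5, 4.5, 6.0):
    print(f"{dc} C = {delta_c_to_f(dc):.1f} F")
```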

Zeke Hausfather

Zeke Hausfather, a data scientist with extensive experience with clean technology interests in Silicon Valley, is currently a Senior Researcher with Berkeley Earth. He is a regular contributor to The Yale Forum (E-mail: zeke@yaleclimatemediaforum.org, Twitter: @hausfath).

22 Responses to Making Sense of Sensitivity … and Keeping It in Perspective

  1. Paul Quigg says:

    Climate sensitivity is one of many uncertainties relevant to future warming. Water has always been the weak link in all studies. It refuses to cooperate in model simulations, and scientists constantly complain, “It is somewhat unsettling that the results of a complex climate model can be so drastically altered by substituting one reasonable cloud parameterization for another”. Another uncertainty is the actual global temperature in 1850; estimates range over 0.6 degrees C. Early climate models only considered the atmosphere, and later coupled models had a tough time including the ocean’s sequestration. Including the other three elements which affect our climate adds further complications and uncertainties.
    We really know very little about the climate, and we are attempting to make very expensive decisions based on uncertainties piled on uncertainties.

  2. Bishop Hill says:

    The problem with this article is that it reviews a paper that is five years old. The science has moved on. There have been a series of empirical and semi-empirical papers that have found low climate sensitivity (Forster and Gregory, Ring et al, Aldrin et al, as well as the Cicero study mentioned) since Knutti and Hegerl reviewed the field. We know that many of the others are given a warm bias because of their use of uniform priors.

  3. Bishop Hill says:

    The claim that

    Ultimately, from the perspective of policy makers and the general public, the impacts of climate change and the required mitigation and adaptation efforts are largely the same in a world where carbon dioxide emissions are rising quickly, whether sensitivity is 2 or 4 C per doubling of CO2 concentrations.

    is also highly suspect. At 2 degrees, the cost of almost every mitigation policy in place today would be greater than the putative cost of emissions, by quite a long way.

  4. John Garrett says:

    Repeat after me: We don’t know.
    Again: We don’t know.

    Maybe it’s time to stop yelling, “The sky is falling.”

  5. Nic Lewis says:

    You say
    ” the Norwegian study referred to earlier, an estimate of sensitivity using temperature data up to the year 2000 resulted in a relatively high sensitivity of 3.9 C per doubling. Adding in just a single decade of data, from 2000 to 2010, significantly reduces the estimate of sensitivity to 1.9 C”

    That is a misconception. The Skeie et al. study referred to (as originally submitted) estimates, using data extending to 2010, that climate sensitivity is most likely to be 1.7 deg. C. Using the same data but ending in 2000, the corresponding estimate is 2.0 deg. C. You can see this if you look at where the peak of the PDF curve is in the top two graphs in slide 6 at http://www.uib.no/People/ngfhd/EarthClim/Calendar/Oslo-2012/ECS_Olavsgard.pdf . Further, I estimate that the 2.0 deg. C figure would have been only 1.8 or 1.9 deg. C had a noninformative prior, rather than a uniform prior, been used in the Bayesian estimation methodology.

    The 1.9 and 3.9 (should be 3.8?) deg. C figures are mean values, not most likely estimates. A mean is not really a suitable measure for the central value of skewed distributions. The increase from 1.9 to 3.9 deg. C almost entirely reflects the PDF based on data only up to 2000 being far worse constrained than when using data up to 2010. Since climate sensitivity PDFs, for good physical/mathematical reasons, normally have tightly constrained lower tails but much worse constrained upper tails, as the PDF becomes worse constrained its mean will progressively increase even if the most likely (best-fit: the mode) estimate – indeed the median estimate – remains unchanged. This effect is particularly strong when using a uniform prior, which typically distorts climate sensitivity PDFs by incorrectly fattening their upper tails.

    If you look at the second graph down on the LHS of slide 9, which uses three rather than one ocean heat content estimates with data ending in 2000, improving how well the climate estimate is constrained, you will see that the mean comes down from 3.8 to 2.2 deg. C even though the most likely estimate does not reduce much.

  6. MSchopp says:

    “James Annan, for example, says his best estimate is 2.5 C”

    Could you provide a quote for this? I think this was his estimate before new data, such as the massive decrease of aerosol forcing, increase of black carbon warming, and temperatures not rising. He has always been critical of the biased Bayesian mathematics of several observational papers in AR4, another reason to suggest an estimate of 2.5 is outdated and now lower.

  7. Bishop Hill,

    There have been a number of papers on sensitivity over the last five years, both on the high and low sides. I’d argue that the papers you mention (Forster and Gregory, Ring et al, etc.) still fit pretty solidly among the instrumental-based reconstructions in Knutti and Hegerl, which fall around 1.5 C to 3.5 C. Other recent papers with higher sensitivity estimates include Hansen and Sato (2011) (up to 5 C), Fasullo and Trenberth (2012) (likely > 3 C), PALAEOSENS (2012) (2.2 to 4.8 C), Hargreaves et al 2012 (0.5 C to 4 C), and Köhler et al. (2010) (1.4 C to 5.2 C). I would agree that recent work might support a “likely” lower bound of closer to 1.5 C than the previous 2 C.

    As far as policy goes, I’d argue that a carbon price of $18 per ton that you associated with 2 C sensitivity would still be much better than the current U.S. carbon price (effectively $0). While sensitivity will certainly help fine-tune the magnitude of the response, it doesn’t change the fact that in either the 2 C or 4 C case we would still need a price on carbon significantly higher than today’s.

    Nic Lewis,

    I’d somewhat contest your assertion that the peak of the PDF rather than its mean is most meaningful, as long tails can have significant impacts from a policy/economics perspective (as Weitzman is fond of saying, the sting is in the tail). Also, the PDF you link frames the up to 2000 results as “in agreement with IPCC AR4″, while the up to 2010 results are on the low side. This bolsters my contention that instrumental-based sensitivity estimates could be highly sensitive to decadal variability.

    Overall, I think the takeaways of recent work are that lower sensitivity is not out of the question (e.g. 1.5 C is probably within the range of estimates), and that the long tail of sensitivity estimates has been somewhat curtailed. Both of these are good news. That said, it doesn’t substantively change the 2 to 4.5 (or 1.5 to 4 C) estimates from the IPCC, and there remains considerable uncertainty regarding what sensitivity actually is.

  8. Nic Lewis says:

    Zeke
    What I wrote was “A mean is not really a suitable measure for the central value of skewed distributions”. I don’t disagree that a measure of the tail sensitivity values is needed, but I think that should be kept separate from the central estimate. I regard cutting the tail extent down as important; it becomes easier to do so as the signal-to-noise ratio improves over time.

    Maybe Skeie et al. reckoned (probably correctly) that stating that results “using data only up to 2000 gives results in agreement with IPCC AR4″.

    I would say that recent work (including one or two accepted studies that haven’t yet come out, as well as studies like Aldrin 2012, Ring 2012 and Skeie) rules out ECS below 1.0 C, but not ECS below 1.5 C.

  9. Nic Lewis says:

    Sorry, I failed to complete my 2nd sentence, which should have read:

    Maybe Skeie et al. reckoned (probably correctly) that stating that results “using data only up to 2000 gives results in agreement with IPCC AR4″ might improve the prospects for less hostile reviews of their paper, and therefore for its timely acceptance.

    • MSchopp says:

      I don’t think that quote supports your case, but rather the opposite.

      The timeline is IMHO

      1. Annan’s 2006 paper, the corresponding blog title: “Climate sensitivity is 3C”
      http://julesandjames.blogspot.jp/2006/03/climate-sensitivity-is-3c.html

      2. An unspecified reduction of the central estimate with the 2009 paper, blog title “Uniform prior: dead at last!”
      “We show that the popular choice of a uniform prior has unacceptable properties and cannot be reasonably considered to generate meaningful and usable results. When instead reasonable assumptions are made, much greater confidence in a moderate value for S is easily justified, with an upper 95% probability limit for S easily shown to lie close to 4C, and certainly well below 6C. These results also impact strongly on projected economic losses due to climate change.”
      http://julesandjames.blogspot.jp/2009/09/uniform-prior-dead-at-last.html

      3. Quote in 2013:
      “The discerning reader will already have noted that my previous posts on the matter actually point to a value more likely on the low side of this rather than higher, and were I pressed for a more precise value, 2.5 might have been a better choice even then.”

      My interpretation is that 2.5C might have been a better choice “even then” in 2009. But now, with the massive changes in aerosol forcing, black carbon warming and the continuing pause in any temperature increase, it has to be lower.

      However, my personal opinion is that Annan does not currently want to commit to any specific new value, without his own new paper/computation based on new data.

      • Toby says:

        Dr Annan may have said it in his legendary interview with David Rose of the Daily Mail, which appeared recently in the newspaper.

        Perhaps Mr Rose could make his notes available ….. ?

      • PeteB says:

        Mschopp – above you missed the beginning of Annan’s quote, which I think undermines your argument

        “Yeah, I should probably have had a tl;dr version, which is that sensitivity is still about 3C.”

  10. Sundance says:

    Zeke, if the measured values in the Hawkins graph fall outside the 5-95% area, how long must they remain outside to consider the 38-model composite inaccurate?

    • dhogaza says:

      “My interpretation is that 2.5C might have been a better choice “even then” in 2009. But now, with the massive changes in aerosol forcing, black carbon warming and the continuing pause in any temperature increase, it has to be lower.”

      No, that’s not what he’s saying at all.

      He’s saying that he thinks 2.5C is more likely than 3C given the current state of knowledge, and that maybe that would’ve been a better figure even back in 2009.

      He’s also been clear that he’s not happy with people trying to spin his position into something it is not.

      You could, of course, go over there and put words into his mouth directly, and see how he responds …

  11. Eli Rabett says:

    A sensitivity of 2 C is just as bad as 4 C, just that your grandkids get it in the neck, rather than your kids. Of course, thanks to the miracle of discount rates, standard economics takes full advantage of this to wash the problem away. Lukewarmism doesn’t help, because even if climate sensitivity is low, given current emission paths we are still headed for disaster.

    Lukewarmers not only have to postulate a low value for climate sensitivity, they need low emissions paths (which means controls) and they need to assume that damages in a 3 C warmer world have been greatly overestimated. You can believe that, but it is a lot to swallow in one gulp.

    • Steven Mosher says:

      A sensitivity of 2 C is just as bad as 4 C, just that your grandkids get it in the neck, rather than your kids. Of course, thanks to the miracle of discount rates, standard economics takes full advantage of this to wash the problem away. Lukewarmism doesn’t help, because even if climate sensitivity is low, given current emission paths we are still headed for disaster.

      ######################

      Lukewarmism describes a scientific position about climate sensitivity. It has no position about policy, damages, benefits,
      dangers, etc. The position is simply this: It is more likely that
      sensitivity is less than 3C rather than greater than 3C. That’s our position, articulated back in 2008-9.

      Second. The difference between 2C and 4C is both time and money.
      Every degree C of sensitivity is worth trillions of dollars.

      http://www.newton.ac.uk/programmes/CLP/seminars/2010120916101.html

      If lower sensitivities are correct then you have wider time windows to deploy renewables, and you avoid sunk cost on renewable technology that is not ready for prime time. More on this later, but we now face a world where near term damage from climate change is a reality: The next 30 years of extreme weather is already in the pipeline and that argues for using resources for adaptation. Dead parents can’t take care of your grand children.

      “Lukewarmers not only have to postulate a low value for climate sensitivity, they need low emissions paths (which means controls) and they need to assume that damages in a 3 C warmer world have been greatly overestimated. You can believe that, but it is a lot to swallow in one gulp.”

      We don’t need to postulate anything about emissions. Lukewarmer is a position on science, not prognostications about future emissions.

  12. Pingback: Linkdump from Facebook 2

  13. Arno Arrak says:

    ….it’s the emissions and concentrations that most matter….

    Nonsense, Zeke. Let’s see what really matters. First, you are pushing for positive water vapor feedback to get those high sensitivities when water vapor feedback is not positive but actually negative. Second, judging by CMIP5 graphs you show, their projections of warming are way off reality. Not only do they ignore the current temperature standstill but they ignore another standstill of eighteen years, between 1979 and 1997. That is slightly longer than the present one. In between them is the super El Nino of 1998. It brought much warm water across the ocean which created a step warming. In only four years global temperature rose by a third of a degree Celsius and then stopped. There has not been any warming since then and there was none before it either, back to 1979. To visualize all this, see figure 15 in my book. After concluding that the standstill of the eighties and nineties exists I started complaining about the late twentieth century warming which wipes it out. This phony warming also blocks in the super El Nino and appropriates its step warming to itself. You don’t have to take my word for this because apparently someone listened. Last fall GISSTEMP, HadCRUT, and NCDC all changed their eighties and nineties temperature values to get rid of that phony warming. Nothing was said and you still have to use my method of analyzing the ENSO oscillations to clearly see it. Do not smooth the curve but put a semi-transparent overlay on it. Then put dots in the middle of every line connecting an El Nino peak and its neighboring La Nina valley and connect the dots. Do that for the eighties and nineties in the satellite graph (or in the new version of those three venerable curves) and you get a horizontal straight line stretching out for eighteen years. A standstill, that, as I said.
The two standstills, the one in the eighties/nineties and the present one, can be regarded as parts of a single standstill, interrupted by the super El Nino of 1998 and its step warming. And this leaves no time for any greenhouse warming for the entire satellite era, back to 1979. In terms of sensitivity, these years tell us that the sensitivity is zero. Hansen makes a big deal of the observation that nine out of ten warmest years happened after 2000. That is not surprising because they all sit on top of the high platform created by the step warming. There has been no warming this century but Hansen has to muddy the waters and claim a phony warming anyway. For instance, he claims that 2005 and 2010 both were warmer than the super El Nino of 1998. This is a lie. 2005 is total fabrication and is no different than its neighbors. 2010 is the peak year of the 2010 El Nino but still lower than 1998. But you can’t use El Nino peaks for global mean temperature. As I mentioned, to get the mean you must average the El Nino peak and its neighboring La Nina valley temperatures. The La Nina 2008 is the one closest to the 2010 El Nino. It is also the one that bamboozled Trenberth in Climategate files! The midway point between the two lines up neatly with the warm platform created by the step warming of 1998. The origin of the step warming is oceanic. But to Hansen the non-existent greenhouse warming of the non-existent late twentieth century warming created it. Up to now we have seen the absence of greenhouse warming for the last 34 years. What about the earlier warming(s)? Here Ferenc Miskolczi’s theory comes in. He published a full exposition of it in 2007, a 40-page Magnum Opus his opponents still don’t understand but nevertheless have been criticizing and ignoring. He followed it up in 2010 and published observations verifying the predictions he had made.
Using NOAA database of weather balloon observations that goes back to 1948 he showed that the absorption of infrared radiation by the atmosphere had been constant for 61 years. At the same time, the amount of carbon dioxide in the air increased by 21.6 percent. This substantial addition of CO2 had no effect whatsoever on the absorption of IR by the atmosphere. And no absorption means no greenhouse effect, case closed. To understand this result, it is not true that carbon dioxide itself did not absorb – it was the atmosphere of the earth that did not absorb. And that is because it also contains other greenhouse gases. Arrhenius theory which the IPCC still quotes did not include them and is incomplete. Miskolczi theory tells us that when several greenhouse gases simultaneously absorb IR there exists an optimum absorption window that they jointly maintain. And that is required for a stable atmosphere to exist. This is an important consideration because Hansen has been threatening us with a runaway greenhouse effect like that one on Venus if we don’t stop burning fossil fuels. Unfortunately, despite having been an astronomer on the Pioneer Venus probe he is totally ignorant of the geology of Venus and knows nothing about the origin of Venusian atmosphere. The major greenhouse gases in our atmosphere are carbon dioxide and water vapor. If, for example, we increase the amount of carbon dioxide overall absorption will increase. Water vapor will then diminish to make up for it and thereby re-establish the previous optimum absorption level. This is equivalent to negative water vapor feedback that I referred to in the beginning. It negates the existence of greenhouse warming, period. The observation that there is no warming today is just one consequence of this fact. Another one is that any and all climate models that use the greenhouse effect to predict warming are invalid. 
And any and all laws and regulations that were passed with the aid of such predictions were passed under false premises and must be nullified. I know Zeke that this is not what you were inculcated to believe in. But if I were you I would use my head, study hard what Miskolczi has done, and get off that sinking ship called AGW.

    Ref: Arno Arrak, “What Warming?” (CreateSpace 2010) – Amazon has it

  14. Leonard Weinstein says:

    Zeke,
    How many more years of little to no warming (or even dropping temperatures) do you need before you would admit you have been had? If warming suddenly continues up even nearly as predicted over the next several years, I would admit I may be wrong, so I do follow the real world. By the way, a more realistic precautionary principle would be to prepare for a cooling world, which would have a much more negative outcome than a little warming, and which is much more likely for two reasons:
    1) We are approaching the end of the Holocene if past periods of interglacials are any guide.
    2) We are likely entering (near term) a little ice age based on solar activity.

  15. Climate change is a very serious concern, and everybody should be sensitive about it. The consequences of climate change are terrible.
    Dr.A.Jagadeesh Nellore(AP),India