Common Climate Misconceptions

Modeling the Climate

Few climate change topics arouse more passion than the seemingly dry field of climate modeling.

Critics thunder that the models contain a “large element of subjectivity” with parameters “tweaked by those who operate the models” to achieve results that conform to scientists’ preconceptions. Some seem to think that these models simply represent a grandiose exercise in curve fitting, forecasting future climate based on the trend in temperatures over the past few decades.

The General Circulation Models (GCMs) that drive future climate projections are highly complex representations of generally well-understood physical processes, and some of the world’s most powerful supercomputers are dedicated to running them. These models will never be perfect, but there are many reasons their projections warrant careful consideration.

Climate models are based on known physical laws and observed climatic behavior. Gavin Schmidt, a climate modeler at NASA’s Goddard Institute for Space Studies, divides the physics involved in climate modeling into three categories:

The first includes fundamental principles such as the conservation of energy, momentum, and mass, and processes, such as those of orbital mechanics, that can be calculated from fundamental principles. The second includes physics that is well known in theory, but that in practice must be approximated due to discretization [making mathematically discrete] of continuous equations. Examples include the transfer of radiation through the atmosphere and the Navier-Stokes equations of fluid motion. The third category contains empirically known physics such as formulas for evaporation as a function of wind speed and humidity.

Really Not Much That Can Be ‘Tweaked’

Despite their complexity, climate models contain surprisingly few parameters that can be “tweaked” by modelers. The dozen or so values that are subjective relate primarily to the initial state of the climate system at the onset of the model run, and the uncertainties generated by variations in initial states tend to average out after a decade or two. The effects of other parameter choices are taken into account by running a sensitivity analysis on a variety of climate models using a range of probable parameters. In general, the average result across a range of different climate models tends to be more accurate than the result of any single climate model in explaining observed climate changes. As a result, Schmidt says, “errors in the simulations are surprisingly unbiased.”

Modeling future climate changes requires a number of discrete steps. First, future emissions of greenhouse gases and other atmospheric forcings (both human and natural) must be estimated. This step introduces one of the single largest sources of uncertainty, as it involves future economic growth, population growth, rate of technology development, remaining fossil fuel reserves and resources, the extent of international conflict and cooperation, and numerous other variables.

Instead of trying to predict any one future, the Intergovernmental Panel on Climate Change (IPCC), in an attempt to provide a reasonable range of future emission trajectories, published a Special Report on Emission Scenarios (SRES) that laid out four broad storylines. Newer post-SRES scenarios help further narrow the range of future trajectories by excluding some projections that appear unrealistic (e.g., a world population in excess of 15 billion in some of the upper-end scenarios).

With future emissions trajectories outlined, modelers next calculate the radiative forcing associated with each of these scenarios. This step involves adding up the amount of energy captured by various greenhouse gases (in watts per square meter of additional energy) and subtracting the cooling effects of tropospheric and stratospheric aerosols. Projections for changes in natural forcings (e.g., solar output, orbital variations, and volcanic activity) are also included, but they tend to be small relative to the magnitude of anthropogenic forcings.

All of these calculations use well-known physical properties of the gases and particles involved, and the uncertainty is relatively small for most factors. The major exception is aerosols, for which considerable uncertainty remains regarding the magnitude of direct and indirect aerosol effects.
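For the best-constrained forcing, carbon dioxide, this bookkeeping can be sketched with the widely used simplified expression ΔF = 5.35 ln(C/C₀) from Myhre et al. (1998). The aerosol offset in the sketch below is a purely illustrative placeholder, not an IPCC estimate:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing from CO2 in watts per square meter,
    using the simplified expression of Myhre et al. (1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Doubling CO2 from its preindustrial 280 ppm yields roughly 3.7 W/m^2.
doubled = co2_forcing(560.0)

# Net anthropogenic forcing subtracts aerosol cooling; the value here is an
# illustrative placeholder, since aerosols are the least certain term.
net = doubled - 1.2
```

The logarithm captures why each successive increment of CO2 adds less forcing than the one before it.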

Radiative Forcing Components
Radiative forcing of various factors influencing the climate. Figure from the IPCC Fourth Assessment Report, Working Group I Summary for Policymakers. (LOSU = Level of Scientific Understanding)

Finally, climate forcings are used to project actual temperature changes by calculating the effects of various climate feedbacks. Prominent feedbacks include increased atmospheric water vapor concentrations (the result of increased evaporation and atmospheric water retention), increased cloud formation, and changes in the reflectivity of Earth’s surface (albedo) resulting from decreases in ice and snow extent.

The influence of clouds on climate remains a major source of uncertainty. In this step, a discrete change in forcings (e.g., a doubling of carbon dioxide from 280 parts per million to 560 parts per million) can lead to a range of potential temperature changes based on uncertainties involving climate feedbacks. In the case of doubled carbon dioxide levels, for instance, the temperature change can range from 2 to 4.5 degrees C.
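One way to see how feedback uncertainty spreads into that 2 to 4.5 degree range is through the standard energy-balance relation ΔT = ΔF / (λ − f), where λ ≈ 3.2 W/m² per degree C is the no-feedback (Planck) response and f is the combined feedback strength. The feedback values below are chosen purely for illustration, not drawn from any particular model:

```python
import math

F2X = 5.35 * math.log(2.0)   # forcing from doubled CO2, ~3.7 W/m^2
PLANCK = 3.2                 # no-feedback (Planck) response, W/m^2 per degree C

def equilibrium_warming(feedback):
    """Equilibrium warming for doubled CO2, given the combined strength of
    water vapor, cloud, and albedo feedbacks in W/m^2 per degree C."""
    return F2X / (PLANCK - feedback)

# With no feedbacks at all, doubled CO2 gives only ~1.2 degrees C.
no_feedback = equilibrium_warming(0.0)

# Illustrative feedback strengths spanning the quoted 2-4.5 degree C range:
low = equilibrium_warming(1.35)    # weak feedbacks
high = equilibrium_warming(2.38)   # strong feedbacks
```

Because the feedback term sits in the denominator, modest uncertainty in f (much of it from clouds) translates into a wide spread in projected warming.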

Comprehensive Climate Model
Elements of a climate model. Figure from IPCC Fourth Assessment Report Working Group One Chapter Ten.

Given the uncertainties involved, why should climate models that project conditions a century or more ahead be trusted? One often hears, after all, that predicting the weather even a few days out can be highly problematic. William Connolley of the British Antarctic Survey and James Annan of the Frontier Research System for Global Change in Japan explain the fundamental differences between weather and climate forecasts:

Although ultimately chaos will kill a weather forecast, this does not necessarily prevent long-term prediction of the climate … We cannot hope to accurately predict the temperature in Swindon at 9am on the 23rd July 2050, but we can be highly confident that the average temperature in the UK in that year will be substantially higher in July than in January … [M]odels based on physical principles also reproduce the response to seasonal and spatial changes in radiative forcing fairly well, which is one of the many lines of evidence that supports their use in their prediction of the response to anthropogenic forcing.

They conclude that, “the calculation of climatic variables (i.e., long-term averages) is much easier than weather forecasting, since weather is ruled by the vagaries of stochastic fluctuations, while climate is not.”

No climate model can ever be perfect, simply because there are no other Earths on which we can conduct long-term climate-wide experiments. Individual elements of climate models can be validated in the lab (e.g., the amount of outgoing longwave radiation trapped by various greenhouse gases), but the emergent behaviors that occur when disparate forcings and feedbacks are combined are difficult to test empirically.

‘Hindcasting’ of Past Climates to Test Models

Climate models can be tested, however, by “hindcasting” past climates. Models can reproduce many features of past climates such as oceanic cooling during the last ice age, temperature changes over the past century, more rapid increase in nighttime than daytime temperatures, a larger degree of warming in the Arctic than in other areas of the Earth, and other past changes. However, climate models are to some degree parameterized based on observations of past climate behaviors; such tests are not a completely independent validation of climate models.

Truly assessing the effectiveness of climate models requires comparing future climate projections to observed temperature changes. While complex climate models have not existed long enough for highly robust validation, and we do not have the luxury to wait and see if they are correct, we can compare earlier generations of models to observed changes over the past two decades or so.

In 1988, NASA scientist James Hansen, in congressional testimony, presented a model of expected climate changes over the next 30 years. He gave three scenarios based on differing rates of future greenhouse gas emissions: a high “A” scenario, a medium “B” scenario, and a low “C” scenario. Actual radiative forcing and temperature changes have occurred roughly at the rate predicted in the “B” scenario, which Hansen at the time identified as the most likely.

While the uncertainties involved in temperature changes over such a short time scale mean that it is difficult to significantly differentiate between the temperature changes forecast in scenarios “B” and “C”, NASA’s Schmidt argues that “the model results were as consistent with the real world over this period as could possibly be expected and are therefore a useful demonstration of the model’s consistency with the real world. Thus when asked whether any climate model forecasts ahead of time have proven accurate, this comes as close as you get.” Climate models were also able to accurately predict the short-term global cooling following the eruption of Mt. Pinatubo in 1991.

Annual Mean Global Temperature Change
Hansen’s 1988 scenarios compared to empirical temperature observations. Figure from RealClimate.

Rather than opaque and subjectively parameterized black boxes, climate models are based on fundamental physical principles and observed climatic behaviors. While significant uncertainties in future climate projections are likely to remain for the foreseeable future, climate projections based on “models” rather than empirical experiments provide valuable insights into an otherwise imponderable future.

Zeke Hausfather

Zeke Hausfather, a data scientist with extensive experience with clean technology interests in Silicon Valley, is currently a Senior Researcher with Berkeley Earth. He is a regular contributor to The Yale Forum (Twitter: @hausfath).