
Uncertainty in Climate Science: Not Cause for Inaction

By Juan M. Restrepo and Michael E. Mann

“The climate has changed and is always changing,” Trump administration spokesman Raj Shah said when asked about the evidence for climate change reported in the Fourth National Climate Assessment from the U.S. Global Change Research Program. Shah echoed prior assertions by climate contrarians that current changes in climate and weather are not unusual; fluctuations, some of them extreme, occurred before the industrial era. In more technical terms, this claim intimates that climate has a stationary statistical distribution: one that does not change with time. It further suggests that recent extreme highs are merely rare samples from that distribution. Shah also implied that the presence of uncertainties makes climate forecasting unreliable.

To explore the assertion of a static climate distribution, we propose a null hypothesis: that record values of a stationary time series occur with a specific, predictable frequency. A record is defined as the largest (or smallest) value to date. We also examine how the incorporation of historically informed uncertainties in natural and anthropogenic factors, including human-generated greenhouse gases (GHG), modifies climate predictions; we do so via a simple model that captures the essential phenomenology of the radiation balance described by more complete, state-of-the-art climate models. This energy balance model (EBM) is used to determine whether uncertainties in GHG emissions or other factors lead to climate projections that differ qualitatively from those obtained without accounting for uncertainties.

Records in Time Series

We invoke the null hypothesis that surface temperatures are samples from a stationary distribution. We then test whether a theorem that applies to stationary distributions—such as one about record highs and lows—is borne out by the data [2].

We draw a time-ordered sequence of independent and identically distributed samples \(X_1\), \(X_2\), \(\ldots\) from a stationary distribution and define a sample in the sequence as a record high (or low) if its value is higher (or lower) than all preceding samples. The probability that the \(n\)th sample is a record high is \(P_n := \textrm{Prob}[X_n > \max\{X_1, X_2, \ldots, X_{n-1}\}]\) (with obvious modifications for a record low). In a sample set of size \(n\), each value has an equal chance of being the highest or lowest; thus, \(P_n = 1\!/\!n\). The expected number of records in a stationary random sequence of size \(n\) is given by the harmonic sum \(\mathbb{E}(R) = 1 + 1\!/\!2 + 1\!/\!3+\cdots + 1\!/\!n\). For large \(n\), \(\mathbb{E}(R) \approx \gamma + \log(n)\), where \(\gamma \approx 0.5772\) is Euler's constant.
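This record-frequency result is easy to verify numerically. The following short Python sketch (our own illustration, with an arbitrary Gaussian choice for the stationary distribution) draws i.i.d. sequences, counts record highs, and compares the average count to the harmonic sum and its asymptotic form:

    import numpy as np

    rng = np.random.default_rng(42)
    n, trials = 1000, 5000
    counts = np.empty(trials)

    for k in range(trials):
        x = rng.standard_normal(n)              # one stationary i.i.d. sequence
        running_max = np.maximum.accumulate(x)  # running maximum to date
        # x[i] is a record high if it exceeds every earlier sample;
        # by convention, the first sample is always a record.
        records = np.concatenate(([True], x[1:] > running_max[:-1]))
        counts[k] = records.sum()

    harmonic = np.sum(1.0 / np.arange(1, n + 1))   # 1 + 1/2 + ... + 1/n
    print(f"mean records observed: {counts.mean():.3f}")
    print(f"harmonic sum:          {harmonic:.3f}")
    print(f"gamma + log(n):        {np.euler_gamma + np.log(n):.3f}")

For \(n = 1000\), all three values cluster near 7.49; doubling the sequence length adds only about \(\log 2 \approx 0.69\) expected records, which is why new records in a stationary series become increasingly rare.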

If the theorem applies to temperature data, we expect to wait increasingly long intervals for each new record temperature because the probability of a record declines as \(1\!/\!(t-t_0)\). Here the time \(t\) of each temperature observation takes the place of the statistical index \(n\), and \(t_0\) is the start of the particular temperature record. We would also expect record highs and record lows to occur at similar rates, since each sample of a stationary sequence is equally likely to be the largest or the smallest to date.

Figure 1a compares the record highs and lows obtained from a synthetic random time series with those of the July temperatures measured at the Moscow station from about 1880 to 2011. The highs and lows are similarly spaced in time for the random series, but the Moscow temperature data shows many early record lows and none after about 1910. By contrast, the record highs are more evenly spaced and continue through the observation period. The data suggest that the theorem is not fulfilled: the rate at which record highs or lows occur at time \(t\) does not follow \(1\!/\!(t-t_0)\).

Figure 1. Records in Northern Hemisphere temperature time series. 1a. Records in a synthetic stationary time series (left) and in July monthly temperatures at the Moscow station (right). 1b. Temperature data, as a function of time, for 30 arbitrary locations in the Northern Hemisphere. 1c. Record values for the seven temperature time series highlighted in 1b. The adjusted temperature subtracts the first value in each time series. The data are taken from the Goddard Institute for Space Studies (GISS) repository, and temperature is recorded in degrees Celsius. We note that in each data set there is a time beyond which no new lows occur, whereas new highs continue to appear as time progresses. Figure created by Juan Restrepo and Michael Mann using data from GISS.

Figures 1b and 1c plot temperature data from 30 Northern Hemisphere locations, chosen at random but mostly concentrated in temperate zones. The time series are not of equal length, and some stations did not report every year. The annual mean temperatures in Figure 1b (which highlights seven arbitrarily chosen time series) indicate a long-term warming trend. But qualitatively at least, it is not obvious from the raw series that a stationary temperature distribution is an unacceptable statistical model for mean temperatures. Figure 1c highlights the records associated with these seven series. To facilitate comparison, we adjust each data set by subtracting its first temperature value, so that every adjusted time series starts at zero. Adding more observations does not change the picture: record lows eventually stop occurring, while record highs can be expected to keep accumulating. The record highs and lows thus tell a clearer story: they do not obey the \(1\!/\!(t-t_0)\) dependence, indicating that the temperatures are not samples from a stationary process.
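The bookkeeping behind Figure 1c can be sketched in a few lines. Because we do not reproduce the GISS station records here, the example below substitutes a hypothetical series (interannual noise plus a modest warming trend) and counts record highs and lows against the stationary expectation:

    import numpy as np

    def record_flags(x):
        """Boolean masks marking record highs and record lows in x."""
        highs = np.concatenate(([True], x[1:] > np.maximum.accumulate(x)[:-1]))
        lows = np.concatenate(([True], x[1:] < np.minimum.accumulate(x)[:-1]))
        return highs, lows

    rng = np.random.default_rng(0)
    years = np.arange(1880, 2012)
    # Stand-in for a station record: noise plus ~0.01 deg C/year of warming.
    temps = rng.normal(0.0, 0.8, years.size) + 0.01 * (years - years[0])

    highs, lows = record_flags(temps)
    expected = np.sum(1.0 / np.arange(1, years.size + 1))  # stationary E(R)

    print(f"record highs: {highs.sum()}, last one in {years[highs][-1]}")
    print(f"record lows:  {lows.sum()}, last one in {years[lows][-1]}")
    print(f"stationary expectation for each: {expected:.2f}")

In runs of this kind, the record lows typically dry up early while record highs continue to arrive, even when the trend is visually subtle relative to the noise; this is the asymmetry evident in Figure 1.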

Incorporating Uncertainties in GHG Projections

Global estimates of GHG emissions are readily available [1] and have tightly constrained uncertainties, since they are critical to the energy sector of the economy. The remaining uncertainties stem from changing policies regarding carbon emissions, including international treaties and carbon pricing, and from the potentially time-varying nature of natural carbon sinks and sources. Here, however, we focus on variability spanning several decades to hundreds of years and the largest spatial scales, and use a simple energy balance to estimate the temporal evolution of the global temperature \(T\).

We can explore the extent to which GHG uncertainties, derived from a statistical analysis of the historical temperature and forcing data, affect the conclusions drawn from future temperature projections. This allows us to compare natural and anthropogenic GHG forcings and determine whether the outcomes' sensitivity depends on the relative uncertainties in these two components. We can also infer whether natural or anthropogenic forcings are dominant, both prior to and during the industrial era as well as in the future.

Blackbody radiation theory tells us that Earth's surface radiates energy at a rate proportional to \(T^4\). The surface energy balance, in terms of surface temperature \(T\), is \(C dT\!/\!dt = Q + \kappa \sigma T_{Atm}^4 - \sigma T^{4}\), where \(T_{Atm}\) is the atmospheric temperature, \(t\) is time, \(C\) is the effective heat capacity, \(\sigma\) is the Stefan-Boltzmann constant, and \(\kappa\) is the atmosphere's absorptivity (and, by Kirchhoff's law, emissivity). \(Q\) represents the effective incoming radiation. If \(C_a dT_{Atm}\!/\!dt\) is small, where \(C_a\) is the effective atmospheric heat capacity, then the atmospheric energy balance gives \(\kappa \sigma T^4 - 2 \kappa \sigma T^4_{Atm} \approx 0\), and hence \(C dT\!/\!dt = Q -(1-\frac{\kappa}{2}) \sigma T^4\). Since the temperature range is not large, we can linearize: \((1-\frac{\kappa}{2}) \sigma T^4 \approx A + BT\), where \(A\) and \(B\) are constants (expanding about a reference temperature \(T_0\) gives \(B = 4(1-\frac{\kappa}{2})\sigma T_0^3\) and \(A = -3(1-\frac{\kappa}{2})\sigma T_0^4\)). The energy balance is spectrally dependent. One portion of the high-frequency (short-wave) component is mostly absorbed at the surface, while another is reflected back to space by clouds and snow/ice. The low-frequency (long-wave) component is instead affected by reflectivity and by a complex layer of gas, dust, and droplets capable of trapping outgoing surface radiation. Let us assume that \(Q\) is a linear combination of the effective solar radiation and GHG-induced radiative forcing. Hence, \(Q = \frac{1}{4}(1-\alpha) S +  F_{GHG}\), where the albedo \(\alpha \approx 0.3\) and the global average solar radiation is presently \(S\!/\!4 \approx 1370\!/\!4 \:\textrm{Wm}^{-2}\). The EBM we adopt is thus

\[C dT = \frac{S}{4}(1-\alpha) dt + F_{GHG} dt -\:(A+B T)dt  + \nu(t) dt,\]

where \(T\) is the temperature of Earth's surface (approximated as a 70-meter-deep, mixed-layer ocean covering 70 percent of the surface area). \(C = 2.08 \times 10^8 \textrm{J}\: \textrm{K}^{-1} \textrm{m}^{-2}\) is the effective heat capacity that accounts for the thermal inertia of the mixed-layer ocean; it does not, however, allow for heat exchange with the deep ocean. The last term, \(\nu(t)\), is a stochastic forcing that represents inherent uncertainties and unresolved processes.
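For readers who wish to experiment, the following is a minimal sketch of how one might integrate this stochastic EBM with the Euler-Maruyama method. The linearization constants \(A\) and \(B\), the GHG forcing ramp, and the noise amplitude below are illustrative stand-ins (Budyko-Sellers-type values with \(T\) in degrees Celsius), not the historically fitted quantities behind Figures 2 and 3:

    import numpy as np

    S, alpha = 1370.0, 0.3         # solar constant (W/m^2) and planetary albedo
    A, B = 202.0, 1.90             # illustrative linearized emission A + B*T (T in deg C)
    C = 2.08e8 / 3.15e7            # heat capacity converted to W yr K^-1 m^-2
    sigma_nu = 0.5                 # assumed noise amplitude, W m^-2 yr^(1/2)

    dt = 0.1                       # time step in years
    t = np.arange(0.0, 250.0, dt)  # nominally 1850-2100

    def f_ghg(time):
        # Hypothetical anthropogenic ramp: 0 in 1850 to ~4 W/m^2 by 2100.
        return 4.0 * time / 250.0

    rng = np.random.default_rng(1)
    T = np.empty(t.size)
    T[0] = (0.25 * S * (1 - alpha) - A) / B   # start at the unforced equilibrium

    # Euler-Maruyama step for C dT = [S(1-alpha)/4 + F_GHG - (A + B*T)] dt + nu dt
    for i in range(1, t.size):
        drift = 0.25 * S * (1 - alpha) + f_ghg(t[i - 1]) - (A + B * T[i - 1])
        noise = sigma_nu * np.sqrt(dt) * rng.standard_normal()
        T[i] = T[i - 1] + (drift * dt + noise) / C

    print(f"warming over the run: {T[-1] - T[0]:.2f} deg C")  # roughly F_GHG/B at the end

Because the relaxation time \(C\!/\!B\) is only a few years in this configuration, the simulated temperature tracks the forcing closely: the stochastic term produces interannual wiggles of roughly a tenth of a degree but cannot offset the upward ramp, which is the qualitative point of Figure 2.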

Figure 2 depicts a single realization of temperature predictions that accounts for natural and anthropogenic forcing and their variability (see Figure 3). The upward trend in long-wave forcing dominates any uncertainties due to natural and man-made forcings during the industrial era. Ultimately, the steadily increasing carbon dioxide forcing overwhelms natural factors in the temperature prediction, both during the industrial era and into the future, even when accounting for variability and uncertainty due to natural volcanic and solar forcing of climate.

Figure 2. Temperature predictions, including uncertainties, for various equilibrium climate sensitivities (ECS). 2a. Highlight of the composite forcing (see Figure 3) for the period 1850-2100. 2b. Temperature predictions as a function of ECS, taking into account uncertainties due to carbon dioxide emissions, volcanic activity, and solar forcing. Stochastic variability due to temperature uncertainties is included; historical temperature variability data informs the stochastic model of temperature fluctuations. From left to right, the ECS equals 4.5, 3, 2.5, 2, and 1.5 degrees Celsius. Figure courtesy of Juan Restrepo and Michael Mann.

Summary

Using observational surface temperature data, we show that temperatures around the Northern Hemisphere do not exhibit a time-stationary distribution. To explore the causal factors behind the observed non-stationarity, we drive a simple zero-dimensional EBM with estimated natural and anthropogenic forcings. The key natural forcings are associated with volcanic emissions and insolation changes, while the anthropogenic forcing is primarily due to the warming effect of GHG increases from fossil fuel burning, accompanied by a secondary, offsetting cooling influence from sulfate pollutants. We explain the effect of inherent uncertainties on projections of future global temperature by constructing historically informed statistical models for the variability of the forcings. One must invoke both natural and anthropogenic forcings for the model simulations to agree with the instrumental temperature data.

Our calculations indicate that the warming is a result of anthropogenic increases in GHG concentrations, a finding that is robust with respect to uncertainties in the forcings as represented by the stochastic models. Moreover, since the effect of forcing variability is small compared to the upward trend of anthropogenic forcing, inherent variability cannot prevent further increases in global temperature; only a slowdown in anthropogenic forcing, i.e., a cessation or decrease in GHG emissions, can do so. Scientists using more sophisticated, state-of-the-art climate models reach the same conclusions and have been unable to find a plausible non-anthropogenic explanation for the observed warming and increase in warm extremes during the anthropogenic era [3]. We find no evidence that future natural radiative forcing contributions could substantially alter projected anthropogenic warming; their variability would contribute only to the “known unknowns” in temperature uncertainty. The model's long-time features agree well with historical data and thus do not require the introduction of epistemic variability (“unknown unknowns”) into the model.

Figure 3. Stochastic long-wave and short-wave forcings and the composite total forcing, with uncertainties due to carbon dioxide emissions, volcanic activity, and solar forcing. A single stochastic realization is depicted. Figure courtesy of Juan Restrepo and Michael Mann.

Shah’s appraisal of the outcomes in the Fourth National Climate Assessment motivated us to demonstrate the ways in which simple, well-established, quantitative methods can address apparent challenges posed by uncertainties in climate assessments. Because key climate change attributes, such as ice sheet collapse and sea level rise, are occurring ahead of schedule [4], uncertainty has in many respects turned against us. Scientific uncertainty is not a reason for inaction. If anything, it should inspire more concerted efforts to limit carbon emissions.

Article partially adapted from “This is How ‘Climate is Always Changing,’” published in the American Physical Society Physics GPC Newsletter, Issue 9, February 2018.

Acknowledgments: We would like to thank Barbara Levi, who provided invaluable editorial assistance with this article.

References
[1] Boden, T.A., Marland, G., & Andres, R.J. (2017). Global, Regional, and National Fossil-Fuel CO2 Emissions. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy. Retrieved from http://cdiac.ess-dive.lbl.gov/trends/emis/overview_2014.html.
[2] Foster, F.G., & Stuart, A. (1954). Distribution-free tests in time-series based on the breaking of records. J. Royal Stat. Soc. Ser. B, 16, 1-22.
[3] Mann, M., Miller, S., Rahmstorf, S., Steinman, B., & Tingley, M. (2017). Record temperature streak bears anthropogenic fingerprint. Geophys. Res. Let., 44(15), 7936-7944.
[4] Mann, M., & Toles, T. (2016). The Madhouse Effect: How Climate Change Denial Is Threatening Our Planet, Destroying Our Politics, and Driving Us Crazy. New York, NY: Columbia University Press.

Juan M. Restrepo is a professor in the Department of Mathematics at Oregon State University. He has courtesy appointments in the College of Earth, Ocean, and Atmospheric Sciences and the Department of Statistics. Michael E. Mann is Distinguished Professor of Atmospheric Science at the Pennsylvania State University. His most recent book, with Tom Toles, is The Madhouse Effect: How Climate Change Denial Is Threatening Our Planet, Destroying Our Politics, and Driving Us Crazy.