

Fall 2002

What are the odds?
Pinning down climate change for policy makers

by Bob Henson

If we’re stuck with limited knowledge about how the climate will behave in 10 or 100 years, perhaps there’s still a way to describe the uncertainties and random elements more concretely. It’s this hope that kept Linda Mearns and about 80 other physical and social scientists grappling with the unseen and unknown in a workshop this summer on uncertainty and climate impacts—a meeting that one participant dubbed “historic.” Sponsored by NSF and by NCAR’s Environmental and Societal Impacts Group, where Mearns is deputy director, the workshop took place in Boulder on 17–19 July.

Climate roulette: The likely amount of global temperature change in the next 100 years can be significantly altered with enactment of stringent policy measures, but uncertainty remains. (Illustration by Michael Shibao from a concept by Mort Webster, University of North Carolina.)

The meeting took on new importance with President Bush’s announcement earlier this year that federal global change research would soon be consolidated through an interagency Climate Change Science Program (CCSP; see sidebar). The workshop’s timing made it well suited to serve as a “kickoff meeting” for the program, according to NCAR director Tim Killeen. Workshop participants plan to draft a five-year action plan by November. It will feed into the CCSP’s timetable, which includes a stakeholder meeting in December and a strategic plan by next March.

How to get past the cascade

Decision support tools and “comparative information” are key elements in the new landscape of federal climate research. When scientists turn in this direction, they tend to produce scenarios—“what if” portraits that show how a given outcome would result from a given climate sensitivity, an assumed rate of population and industrial growth, and so on. The rub is that each scenario is based on assumptions that stand on the shaky shoulders of other assumptions, a sequence that Mearns calls “the daunting cascade of uncertainty.”
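To see how that cascade compounds, consider a minimal Monte Carlo sketch in which each assumption in the chain is a random draw. Every distribution and parameter value below is an invented placeholder, not a published estimate, and the linear chaining is a deliberate simplification rather than climate physics:

```python
# A toy illustration of the "cascade of uncertainty": each assumption
# in the chain is itself uncertain, and the spread compounds step by
# step. All distributions and values are invented placeholders.
import numpy as np

rng = np.random.default_rng(2002)
n = 100_000  # Monte Carlo draws

# Assumption 1: 2100 emissions as a multiple of today's levels
emissions = rng.uniform(0.5, 5.0, n)

# Assumption 2: concentration response per unit of emissions
carbon_cycle = rng.normal(1.0, 0.15, n)

# Assumption 3: warming (deg C) per effective CO2 doubling
sensitivity = rng.normal(2.5, 0.7, n)

# Chain the assumptions; each stage widens the resulting spread
warming = emissions * carbon_cycle * sensitivity / 2.5

lo, med, hi = np.percentile(warming, [5, 50, 95])
print(f"5th/50th/95th percentile warming: {lo:.1f} / {med:.1f} / {hi:.1f} C")
```

Even with these made-up numbers, the 5th-to-95th-percentile range of the final answer is several times wider than that of any single assumption, which is the cascade in miniature.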

Public policy analyst Edward Parson (Harvard University) noted that in one report from the Intergovernmental Panel on Climate Change (IPCC), “the uncertainty of future emissions scenarios is huge,” ranging from 50% to 500% of present-day levels by 2100.

Although these unknowns might seem insurmountable, they can be addressed through careful analysis. Parson and others at the workshop pointed to the critical role of output variables in whether or not a given scenario proves useful. Is a temperature rise of 4.5°F (2.5°C) dangerous, and if so, for whom? “Simple models that only deal with temperature have seen their day,” declared NCAR’s Kevin Trenberth. Susan Solomon (NOAA) agreed: “We’re going to have to describe things differently than we have in the past.” Impacts—the number of heat waves above some criterion, the frequency of tides above a cutoff point—seem to be what policy makers find most useful.
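To show how such an impact measure might be derived from raw model output, here is a minimal sketch that counts threshold exceedances in a synthetic series of daily highs. The criterion and the data are assumptions for demonstration only:

```python
# Minimal sketch: turning raw temperatures into an impact metric by
# counting days above a heat-wave criterion. The threshold and the
# synthetic series are illustrative assumptions, not model output.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for one summer (92 days) of daily highs, in deg F
daily_highs = rng.normal(loc=90.0, scale=6.0, size=92)

HEAT_CRITERION_F = 100.0  # hypothetical local heat-wave threshold

hot_days = int((daily_highs > HEAT_CRITERION_F).sum())
print(f"days above {HEAT_CRITERION_F:.0f} F: {hot_days}")
```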

In one recent example, projected changes in California’s snow pack caught the eye of that state’s water managers. Consultant Lester Snow (Saracino-Kirby-Snow) noted that April-through-July runoff in California has dropped from 45% to 35% of the average yearly total over the last century. A separate model-based analysis by Noah Knowles (Scripps Institution of Oceanography) shows a 20% to 40% reduction in average annual snow pack by 2040. The projection got its zing only after being translated by Maury Roos, a California state hydrologist for 40 years and “a guy with relevance and standing in the water-use community,” according to Snow. Thus, credibility among a particular user group—and the ability to tailor findings directly for that group—can enhance the value of a climate scenario.

Under one roof:
Bush reorganizes federal climate research

In the biggest shift for federal global change research in more than a decade, President George W. Bush has announced the formation of a Climate Change Science Program. The CCSP will encompass projects carried out by 12 agencies, including work now done under the auspices of the U.S. Global Change Research Program (USGCRP). The umbrella program will be headed by NOAA’s deputy administrator, James Mahoney.

“We have the opportunity to make a difference,” Mahoney told participants at the July U.S. workshop on uncertainty and climate impacts. In testimony delivered to Congress a few days before, Mahoney called for a “period of differentiation and strategy investigation” as the main theme of the CCSP. The focus is on “identifying the scientific information that can be developed within two to five years to assist the nation’s evaluation of optimal strategies to address global change risks.” The new program will emphasize three areas: continued scientific inquiry, upgraded measuring and monitoring systems, and an increased focus on decision support tools.

A workshop to be held in December in Washington, D.C., will solicit scientist and stakeholder comment on the CCSP. In March 2003, Mahoney plans to release the nation’s first overall research plan on global change since 1990. Many of the ongoing activities of the USGCRP are expected to continue, but with changes in emphasis in some areas. For example, Our Changing Planet, the annual USGCRP report, will increasingly emphasize the analysis of proposed mitigation strategies.

Of course, a single scenario can’t weave the full tapestry of our potential climate. Stephen Schneider (Stanford University) faced this issue long ago. The IPCC’s first assessment report, back in 1990, was one of the first places where he and colleagues tried to openly address climatic uncertainty, according to Schneider. It’s where the “roll of the dice” analogy was introduced by Jerry Mahlman (formerly of NOAA’s Geophysical Fluid Dynamics Laboratory, now with NCAR’s Advanced Study Program).

In 1995, Granger Morgan and David Keith at Carnegie Mellon University asked 16 climate experts for their best guess on the global temperature increase that would result from a doubling of atmospheric carbon dioxide. As Schneider pointed out to workshop attendees, “About half of the people [from that survey] are in this room.” Schneider said he would rather have seen each of the 16 draw a probability distribution function, or PDF—a curve showing graduated likelihoods across a temperature spectrum—than settle on a single temperature value. “The process of doing that would probably have substantially improved the assessment.”
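To illustrate what Schneider was asking for, here is a minimal sketch of the difference between a point estimate and a PDF. The expert values below are invented for illustration; the 1995 survey’s actual responses are not reproduced here:

```python
# Minimal sketch: a single best-guess number versus a probability
# distribution function (PDF) over warming. The expert values below
# are invented for illustration, not the 1995 survey responses.
import numpy as np
from scipy import stats

# Hypothetical best guesses (deg C of warming per CO2 doubling)
best_guesses = np.array([1.5, 2.0, 2.2, 2.5, 2.8, 3.0, 3.5, 4.5])

# A point summary collapses the spread to one number...
print(f"mean best guess: {best_guesses.mean():.1f} C")

# ...while a kernel-density estimate yields a PDF that assigns
# graduated likelihoods across the whole temperature spectrum.
pdf = stats.gaussian_kde(best_guesses)
grid = np.linspace(0.0, 6.0, 121)
density = pdf(grid)
print(f"most likely value: {grid[density.argmax()]:.2f} C")
print(f"P(warming > 4 C): {pdf.integrate_box_1d(4.0, np.inf):.2f}")
```

The PDF carries information a single number cannot, such as the probability of warming beyond some threshold, which is exactly the kind of quantity policy makers can act on.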

Yet, in the end, those who actually set policy will have limited time and attention for sifting through finely detailed climate projections. “Just putting up a PDF or some graphical representation is not a good solution,” said Steven Smith (Pacific Northwest National Laboratory). One way out, he and others suggested, is to attach appropriate caveats to simplified scenarios so that users get the main points and see the limitations at the same time.

What’s next

Among the ideas now being explored for the post-workshop action plan:

• A scenario development program. For nearly two years, European Union countries have been carrying out a coordinated, end-to-end scenario project called PRUDENCE (Prediction of Regional Scenarios and Uncertainties for Defining European Climate Change Risks and Effects). Many workshop participants felt it was high time for a scenario development process that might include a stateside analogue to PRUDENCE, perhaps with a structure that gave somewhat freer rein to individual researchers. The process would take advantage of the gradually increasing resolution of global atmosphere-ocean models that simulate long time spans, as well as shorter runs drawn (as were those for PRUDENCE) from high-resolution regional climate models.

• A distribution center. Just as NOAA now maintains the National Climatic Data Center, participants saw the need for a place to archive, tailor, and distribute climate scenario output. Unlike NCDC, this service wouldn’t be in a single building but would be a distributed archive. Such a multi-site venture could take advantage of the AccessGrid and other collaborative tools now being pioneered for atmospheric research at NCAR’s Scientific Computing Division and elsewhere.

• Liaisons to bridge models and output. Experts are sorely needed to extract the most useful data from existing and future models, then translate the data into terms policy makers can use. As was the case in California, some of these people may be regional or local specialists who can fit scenario outcomes to the geographic areas they’re familiar with.

The most crucial outcome of this workshop could be a new level of sophistication in how climate change is presented to the nation at large. The U.S. National Assessment, mandated by Congress and released in 2000, succeeded in bringing climate science to the public with the first in-depth look at how climate change could affect each part of the country. Yet the assessment was limited in its model base (only two global models were used, both of them several years old) and produced on a tight timeline. The dovetailing of this summer’s uncertainty workshop and the new federal emphasis on scenarios could result in an even richer portrait of the nation’s climatic possibilities the next time around.

The five-year goal, says Mearns, who co-chaired the workshop with NCAR’s Warren Washington, is “a complete North American multiscenario distribution system that will take into account the major known uncertainties surrounding future climate. Scenarios aren’t predictions, but by taking into account all the uncertainties, we can be prepared.”


