The year 2004 was a record-setter for U.S. tornadoes (the preliminary total was 1,722, including the one above near Mulvane, Kansas).
Is climate change making weather more extreme? Often the answer lies somewhere between yes and no. In that vast land of uncertainty, climate scientists are working to clarify the relationships between lumbering changes in climate and the flashpoints of extreme weather. They’re applying finer-scale models to the task and sharpening definitions of what constitutes an extreme.
The impetus for much of this work is the upcoming fourth assessment by the Intergovernmental Panel on Climate Change. A wide array of simulations, now being finalized at a variety of modeling centers, will feed into the next IPCC report in 2007. Similar work is supporting the U.S. Climate Change Science Program, which launches a study of North American extremes this year.
As the latest results trickle in, some are confirming patterns of climate change already observed. But there is still precious little knowledge about trends in some of the weather features that threaten and frighten people most.
Linda Mearns. (Photo by Carlye Calvin.)
“There’s a push on climatologists to say something about extremes, because they are so important. But that can be very dangerous if we really don’t know the answer,” says NCAR’s Linda Mearns, an IPCC lead author. She’s heading up one of the key crosscutting activities at NCAR: an initiative to bolster the techniques of weather and climate impact assessment.
Putting weather extremes in a climate-change context is difficult for several reasons. For one, extremes are, by definition, rare, which makes statistical conclusions far more difficult to draw. Another is the tendency for the wildest weather to affect areas smaller than global climate models can depict. Even getting a handle on extremes in everyday phenomena, such as temperature, isn’t as easy as it might seem.
“What you can do is limited by the data that exist,” says NCAR’s Kevin Trenberth, convening lead author for the climate observations section of the upcoming IPCC report. “When you’re trying to look at extremes, you’re much more subject to problems in the data and changes in the instrumentation.”
Other hurdles are sociological. Some nations, with an eye on the commercial value of their weather data, won’t release long-term statistics for open research use. “In many parts of the world, enough daily data have been digitized to contribute to an analysis, but institutions are reluctant to part with them,” says Thomas Peterson (NOAA National Climatic Data Center).
Partially as a way around these restrictions, a landmark set of five regional workshops took place from Cape Town, South Africa, to Pune, India, over the last year. Organized by a joint expert team from the World Meteorological Organization and the CLIVAR (Climate Variability and Predictability) project, these meetings brought sets of outside experts together with local participants for several days of seminars and hands-on analysis.
While the meetings served as a skill-building exercise for many participants, the results had immediate value. Each group used daily data for its region to compute changes over the last few decades in a set of 27 indices of climate change, most of them related to extremes.
“So far none of these workshops have been able to release time series of daily data,” says Peterson. “However, we’ve had great success in reaching agreement to release the indices of changes in extremes.” A global summary of the results will be ready in time for the 2007 IPCC report.
Wetter, warmer, and less frosty?
Where the data exist, scientists have been able to make increasingly bold conclusions about current and future trends in extreme weather. One of the first such studies, led by Thomas Karl (now director of the National Climatic Data Center) in the 1990s, found that an increasing share of U.S. precipitation was falling in more intense bursts. The findings are in sync with the idea that a warmer global atmosphere would carry more water vapor, which in turn should fuel heavier rains.
Later work with daily data confirmed this trend for the United States and other midlatitude areas. Some parts of the African and Asian tropics have reported a decrease in the number of very wet days. However, sparse data on precipitation over the oceans make it hard to piece together a global picture. As for the future, global climate models to date have lacked the fine detail to assess downpours that may only last a few hours and affect areas smaller than a model can see.
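Indices of this kind reduce to simple operations on daily station records. The sketch below (Python, with synthetic numbers; the published indices use more careful definitions of wet days and quantile thresholds) estimates the share of annual precipitation contributed by the wettest few days:

```python
def very_wet_day_fraction(daily_precip_mm, pct=0.95):
    """Share of total precipitation falling on 'very wet' days, defined
    here (illustratively) as wet days above the pct quantile of wet-day
    amounts. Uses a crude nearest-rank quantile."""
    wet = sorted(p for p in daily_precip_mm if p > 0)
    if not wet:
        return 0.0
    threshold = wet[int(pct * (len(wet) - 1))]
    total = sum(wet)
    extreme = sum(p for p in wet if p > threshold)
    return extreme / total

# Toy year: 300 dry days, 95 light-rain days, and 5 downpours of 20 mm.
year = [0.0] * 300 + [1.0] * 95 + [20.0] * 5
print(very_wet_day_fraction(year))  # about 0.51: half the rain fell in 5 days
```

If heavy days grow at the expense of light ones, this fraction rises even when the annual total stays flat, which is why such indices reveal changes that annual totals alone would hide.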
“Extremes are things that happen at very precise locations, especially for precipitation,” notes Claudia Tebaldi, an NCAR statistician who works with the output from global model runs.
Jerry Meehl. (Photo by Carlye Calvin.)
Temperature departures, which can stretch across vast areas and last for weeks, are easier for global models to capture. Tebaldi teamed with NCAR colleagues Douglas Nychka and lead author Gerald Meehl for a pair of 2004 papers, one in Science that analyzed future global changes in heat waves (defined several ways) and the other in Climate Dynamics analyzing frost days (those on which the temperature dips below 0°C or 32°F).
It was the first time Meehl had closely examined day-by-day output to study extremes from a global model—in this case, the Parallel Climate Model, sponsored by the U.S. Department of Energy and based at NCAR. “The model did a pretty good job of simulating both frost days and heat waves, which surprised me,” says Meehl.
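Both indices boil down to simple counts over daily model output. A minimal sketch (Python; the heat-wave measure here is a loose stand-in for the duration index the papers define more carefully, and the threshold of 5 degrees above baseline is an illustrative assumption):

```python
def frost_days(daily_tmin_c):
    """Count of days whose minimum temperature dips below 0 degrees C."""
    return sum(1 for t in daily_tmin_c if t < 0.0)

def longest_warm_spell(daily_tmax_c, baseline_c, excess=5.0):
    """Length of the longest run of consecutive days with maximum
    temperature more than `excess` degrees above that day's baseline --
    a rough heat-wave duration measure (illustrative, not the papers'
    exact definition)."""
    best = run = 0
    for t, b in zip(daily_tmax_c, baseline_c):
        run = run + 1 if t > b + excess else 0
        best = max(best, run)
    return best

print(frost_days([-1.2, 0.0, 2.5, -0.1]))                      # 2
print(longest_warm_spell([30, 31, 32, 25, 33], [24] * 5))      # 3
```

Applied to every grid point and year of a model run, counts like these yield the maps of changing frost days and heat waves described above.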
One of the most striking results is a projected shift in circulation favoring northwest flow across both North America and Eurasia. This pulls maritime air into the western fringes of both land masses, which in turn leads to warming across the Pacific Northwest, western Canada, and northern Europe (see graphic below). The model projects lesser but still significant warming across the eastern parts of North America and Asia. This overall pattern fits well with the trends of recent decades, says Meehl.
A study using the Parallel Climate Model projects that the average number of frost days per year will decrease across much of the globe by the 2080s, with the largest decreases (red and orange) across the western fringes of North America and Europe. (Illustration courtesy Gerald Meehl and Claudia Tebaldi.)
Although it’s difficult to attribute cause and effect for one extreme season, the winter of 2004–05 could serve as a poster child for the trends in the Meehl-Tebaldi study. Oregon and Washington have seen recurrent bouts of record warmth, and the area’s ski industry has been paralyzed by a lack of snow.
Meehl and Tebaldi are now planning a collaboration with NCAR colleague Grant Branstator to see if the global-scale changes they’re projecting are triggered by shifts in tropical convection across south Asia and the African Sahel. They’ve also been working with colleague Julie Arblaster to generate output from the NCAR-based Community Climate System Model for a set of 10 top-priority extremes designated by the IPCC (see sidebar at right).
Toting up tornadoes
In the eyes of a global climate model—where any object smaller than 30 kilometers (about 19 miles) wide is hard to discern—tornadoes are invisible. It’s up to people like Harold Brooks (NOAA National Severe Storms Laboratory) to keep track of them and divine how their frequency might change in the coming century.
Tornado counts across the United States have risen steadily since the 1950s, thanks to storm chasers and improved means of detection. The same process is now unfolding in parts of Europe and Australia, says Brooks. But the strongest tornadoes, those with winds topping 320 km per hour (200 mph), don’t seem to have become any more or less frequent since U.S. records began a few decades ago. The worst of the lot are those rated F5 on the Fujita intensity scale, a strength reached by only about one in a thousand tornadoes in North America, or one per year on average. This makes F5s so rare that statistics become meaningless. “It’s impossible to say if there are any trends in F5 tornadoes,” says Brooks.
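Brooks’s point can be made with a back-of-envelope simulation (not from the article; synthetic Poisson counts, not real tornado data): for an event averaging one occurrence per year, random year-to-year variation alone makes multi-decade totals swing by several events, enough to swamp any modest trend.

```python
import math
import random

def poisson_draw(lam, rng):
    """One Poisson(lam) count via Knuth's method (fine for small lam)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def spread_of_totals(rate=1.0, years=25, trials=2000, seed=1):
    """Simulate many multi-decade event totals at a fixed yearly rate
    and return the mean and standard deviation of those totals."""
    rng = random.Random(seed)
    totals = [sum(poisson_draw(rate, rng) for _ in range(years))
              for _ in range(trials)]
    mean = sum(totals) / trials
    var = sum((t - mean) ** 2 for t in totals) / trials
    return mean, var ** 0.5

mean, sd = spread_of_totals()
print(f"mean {mean:.1f}, standard deviation {sd:.1f} events per 25 years")
```

With a true rate of one per year, a 25-year total typically lands anywhere from about 20 to 30 by chance alone, so even a sizable change in the underlying rate would be hard to distinguish from noise.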
Scientists are trying to improve estimates of the global prevalence of severe thunderstorms, both now and in the future. (Photo © Adam Houston.)
To infer how the global atmosphere might be changing in ways that favor or suppress severe storms, Brooks and Aaron Anderson (now at Weathernews Inc.) have been sifting through global reports of temperature, wind, and moisture from 1958 to 1999. With support from the NCAR impacts assessment initiative, Brooks is cataloging the patterns of instability and wind shear that support severe weather, even in places where nobody may be looking for hail or tornadoes or taking note of them. He expects to complete the analyses later this year and make them available to researchers.
Jeff Trapp. (Photo courtesy Purdue University.)
Meanwhile, at Purdue University’s new Climate Change Research Center, Jeff Trapp, Matthew Huber, Noah Diffenbaugh, and colleagues hope to simulate severe weather more directly, using global, regional, and cloud-resolving models. “I think it’s time to start looking at the details, with the realization that you have to temper the results with the limitations of the models,” says Trapp.
He and his peers propose to nest the International Center for Theoretical Physics’ RegCM3 regional model, which began at NCAR, within global models. Using precipitation and other RegCM3 output, the team would identify possible severe-weather outbreaks and then track those with a higher-precision cloud-resolving model, perhaps a variant of the multiagency Weather Research and Forecasting model. They’ll watch to see if WRF can produce the correct types of thunderstorms for past events, such as 1974’s record-setting outbreak of tornadoes across the eastern United States. If so, then the strategy could be used in a forward-looking mode to shed light on how severe weather might evolve in future climate regimes.
“We’re not trying to predict what exactly is going to happen in the year 2050,” says Trapp. “What we’re trying to understand is the range of future behavior on the mesoscale and local scale.”
Whither the lows?
The findings of the Purdue team may hinge in part on the evolution of midlatitude cyclones. These broad centers of low pressure can trigger severe thunderstorms, blizzards, and other extreme events as they sweep from coast to coast. Such cyclones are a particular concern in Europe, where in wintertime they produce some of the continent’s worst weather.
According to recent work by Nicholas Graham (Scripps Institution of Oceanography) and Henry Diaz (NOAA Climate Diagnostics Center), substantial changes have occurred in midlatitude cyclones over the last 50 years. Across the Northern Hemisphere, these lows appear to be forming more often and more intensely, packing stronger peak winds. However, sparse data over the oceans throw some uncertainty into the mix. The findings also run counter to model projections that indicate, except for a few regions, a general weakening of midlatitude cyclones with climate change as pole-to-equator temperature contrasts diminish.
Closer to the equator, the fate of tropical cyclones is another wild card. At NOAA’s Geophysical Fluid Dynamics Laboratory, Thomas Knutson is analyzing how a new regional climate model generates hurricanes over the Atlantic Ocean. In studies of future changes in hurricane and typhoon frequencies, global models hardly provide a consensus, he says. “Some models show increased frequency and some show a decrease. They sort of splatter all over the place.”
Regardless of their frequency, future hurricanes and typhoons might well become more intense. Knutson found that a variety of model configurations tended to agree on an overall rise in peak winds and rainfall rates as greenhouse gases increase. By 2080, assuming a buildup of 1% per year in carbon dioxide levels, the worst tropical cyclones could pack winds roughly 12 km per hour (8 mph) higher, on average, than present-day peak levels.
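The 1%-per-year scenario compounds quickly; a quick arithmetic check (not model output) shows carbon dioxide more than doubling over 80 years of such growth:

```python
import math

def compound_growth(rate=0.01, years=80):
    """Multiplier after compounding `rate` per year for `years` years."""
    return (1.0 + rate) ** years

def doubling_time(rate=0.01):
    """Years for a quantity growing at `rate` per year to double."""
    return math.log(2.0) / math.log(1.0 + rate)

print(compound_growth())   # about 2.22x after 80 years
print(doubling_time())     # about 70 years to double
```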
New statistical treatments are helping to illuminate the character of extremes. The left-hand graph shows June-to-August temperatures over the past century in Basel, Switzerland, in degrees Celsius, plotted as the 25th, 50th, and 75th percentiles (red, black, and blue lines) of each summer’s daily readings. In the right panel, the same temperatures are plotted relative to each summer’s median. This reveals that the peak of the 2003 heat wave was less extreme, relative to its own summer, than the peaks of heat waves in the 1940s, even though the median summer temperature was substantially warmer in 2003. (Illustrations courtesy Martin Beniston and David Stephenson; for more details, see Beniston, M., and D. B. Stephenson, 2004: Extreme climatic events and their evolution under changing climatic conditions. Global and Planetary Change, 44, 1–9.)
Whether such a trend would be detectable isn’t clear. Events like hurricanes and tornadoes vary so strongly from week to week, year to year, and decade to decade that any long-term, climate-driven trend could pale by comparison. Disagreements on how to discuss such subtleties in the public realm can be quite sharp. In January, NOAA hurricane researcher Christopher Landsea withdrew from his IPCC role as contributing author. Landsea claimed that a Harvard University news conference on 21 October, which included NCAR’s Trenberth, presented a stronger case than warranted for the impacts of recent and future climate change on hurricanes. A transcript of the Harvard news conference is available (see “On the Web”).
Ultimately, a dual question hangs over the entire realm of extremes and climate change: Will we really know when extremes have changed, and will we be able to prove that such shifts are associated with human-induced climate change? It’s what specialists call the detection/attribution problem.
Europe’s catastrophic heat wave of 2003 has raised both scientific and public interest in attribution. A landmark paper in Nature last December, written by Peter Stott (Hadley Centre for Climate Prediction and Research) with colleagues at the University of Oxford, is one of the first to leap both hurdles of the detection/attribution problem. The authors conclude that around 75% of the risk of a 2003-strength European heat wave is due to human influence on climate. They add that if greenhouse gases increase at the rate projected in one IPCC scenario, then the conditions observed in Europe’s 2003 heat wave would be considered unusually cool by 2100. The team used a statistical technique called optimal detection analysis, applied in previous work for global-scale attribution of climate change but used here for the first time for a particular climate event.
While the Nature paper compared the 2003 heat wave to average summer readings from 1961 to 1990, David Stephenson (University of Reading) notes that the average itself is rising. He’s interested in how peaks and valleys of temperature might change relative to a baseline that’s slowly moving upward. “Do the extremes warm up more than the mean? That’s the idea, but how to study it properly is the problem.” Stephenson will tackle such issues as he coordinates the extremes component of a massive five-year study called ENSEMBLES, now getting under way in the European Union (see “On the Web”).
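Stephenson’s question can be posed concretely: fit a trend to each summer’s mean and to each summer’s upper quantile, then compare the slopes. A toy sketch with synthetic data (illustrative only; real analyses must contend with autocorrelation, changing variance, and quantile-estimation uncertainty):

```python
def ols_slope(y):
    """Ordinary least-squares slope of y against year index 0..n-1."""
    n = len(y)
    xm = (n - 1) / 2.0
    ym = sum(y) / n
    num = sum((i - xm) * (v - ym) for i, v in enumerate(y))
    den = sum((i - xm) ** 2 for i in range(n))
    return num / den

def empirical_quantile(data, q):
    """Crude nearest-rank empirical quantile of a sample."""
    s = sorted(data)
    return s[int(q * (len(s) - 1))]

# Synthetic summers: the mean warms slowly while the spread widens,
# so the hot tail warms faster than the mean does.
summers = [[20.0 + 0.01 * y + d * (1 + 0.02 * y) for d in range(-5, 6)]
           for y in range(30)]
mean_trend = ols_slope([sum(s) / len(s) for s in summers])
hot_trend = ols_slope([empirical_quantile(s, 0.9) for s in summers])
print(mean_trend, hot_trend)  # the 90th-percentile trend exceeds the mean trend
```

In this contrived example the hot tail warms nine times faster than the mean, which is exactly the kind of divergence a fixed-baseline comparison would miss.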
Christopher Ferro, also at Reading, and Stephenson recently developed a technique that analyzes distributions based on quantiles (the cut points that divide a sample into portions, such as the value separating the warmest 25% of days from the rest) rather than on centers or endpoints. This technique, says Ferro, can “highlight differences, such as changes in the tails of a distribution, that have important, practical consequences.” Scientists in the climate-statistics game have also been calling on a Web-based toolkit created by NCAR’s Richard Katz and Eric Gilleland for gleaning trends from climate data sets (see “On the Web”).
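The idea of comparing distributions quantile by quantile can be sketched in a few lines (a simplified, hypothetical analogue, not Ferro and Stephenson’s published method, which uses more careful quantile estimators):

```python
def nearest_rank_quantile(data, p):
    """Crude nearest-rank empirical quantile of a sample."""
    s = sorted(data)
    return s[int(p * (len(s) - 1))]

def quantile_differences(sample_a, sample_b,
                         ps=(0.05, 0.25, 0.5, 0.75, 0.95)):
    """Difference (b minus a) at each quantile level; unequal entries in
    the tails flag distributional changes that a simple difference of
    means would obscure."""
    return {p: nearest_rank_quantile(sample_b, p) -
               nearest_rank_quantile(sample_a, p)
            for p in ps}

# Toy comparison: sample_b stretches sample_a by 50%, so the warm tail
# shifts far more than the middle of the distribution.
a = [float(x) for x in range(100)]
b = [1.5 * x for x in range(100)]
print(quantile_differences(a, b))
```

Here the difference at the 95th percentile is more than twice the difference at the median, a tail change that the two samples’ means alone would understate.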
Extremes aren’t just solo threats; they can gang up, too. “For human health, the clustering of extremes is extremely important,” says Stephenson. This topic came to the fore last summer at an NCAR workshop on heat waves and impacts, led by Mearns and physician Jonathan Patz (University of Wisconsin–Madison). Heat waves often occur in tandem with episodes of ground-level pollution, for example. Along these lines, Mearns and Patz hope to extend the heat-wave projections from Meehl and Tebaldi and look for possible trends in mortality and morbidity triggered by heat-and-pollution clusters.
When it comes to extremes and climate, says Mearns, “You’re not immediately going to get the answers. It takes time. But we really have to start focusing on some of these more difficult problems.”
Ten extremes to watch for
In its analysis of extremes, the 2007 IPCC report will focus on 10 indices published by Povl Frich (now at Denmark’s National Environmental Research Institute) and colleagues in a 2002 Climate Research paper. Modeling groups around the world are now producing output keyed to these variables.
Total number of frost days (days with minimum temperature below 0°C/32°F)
Intra-annual extreme temperature range
Growing season length
Heat wave duration
Warmth of daily lows
Average daily precipitation
Influence of very wet days on annual total
NCAR modelers are using the Community Climate System Model to analyze 20th- and 21st-century trends in 10 extremes identified by the IPCC. This graphic shows the CCSM3 depiction of changes from 1880–99 to 1980–99 in the number of days with precipitation greater than 10 millimeters (0.39 inches). Stippled areas indicate regions where the trend is statistically significant at 90% confidence. (Illustration courtesy Julie Arblaster, NCAR.)