FOR RELEASE AFTER:
March 11, 1997
EMBARGOED UNTIL March 13, 1997
P.O. Box 3000
Boulder, CO 80307-3000
Telephone: (303) 497-8611
Fax: (303) 497-8610
NCAR is managed by the University Corporation for Atmospheric Research under sponsorship by the National Science Foundation.
MSU readings for the lowest several kilometers have been averaged and yearly trends calculated since 1989. These show a cooling of 0.03 to 0.05 degree Celsius per decade since 1979. More traditional global temperature averages taken near the ground show a rise of about 0.1 degree C/decade over the same period. The difference in trends has been a subject of spirited debate because of its implications for the projection and measurement of global warming. (The projected rate of warming is typically around 0.2 degree C/decade.)
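The decadal trends quoted above come from fitting a straight line to yearly average temperatures. A minimal sketch of that calculation, using synthetic anomaly values rather than actual MSU or surface data:

```python
# Illustrative sketch: estimating a decadal temperature trend from
# yearly global-mean anomalies with an ordinary least-squares fit.
# The anomaly values below are synthetic, not real observations.

def decadal_trend(years, anomalies):
    """Return the least-squares slope in degrees C per decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
    den = sum((x - mean_x) ** 2 for x in years)
    return (num / den) * 10.0  # per-year slope scaled to per-decade

years = list(range(1979, 1996))
# Synthetic anomalies rising at exactly 0.01 degree C per year (0.1 C/decade)
anomalies = [0.01 * (y - 1979) for y in years]
print(round(decadal_trend(years, anomalies), 3))  # 0.1
```

A spurious step change in the data, such as the satellite-transition drops Hurrell and Trenberth identify, would shift the fitted slope even if the underlying year-to-year variation were unchanged.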
In their Nature article, Hurrell and Trenberth argue that the MSU data, while useful for many purposes, are poorly suited for gauging long-term surface temperature trends. MSUs monitor the globe more thoroughly than surface reports, which are concentrated over land and sparse over the oceans. However, each MSU lasts only a few years before being replaced by an instrument on a different satellite. According to the NCAR scientists, the transitions between satellites may be producing spurious temperature drops that mask an actual rise in global readings. "The surface and MSU records measure different physical quantities," write Hurrell and Trenberth, "so that decadal trends should not be expected to be the same." However, they add, "unreconciled discrepancies among the different records remain."
To study the matter further, the scientists focused on the tropics between 20 degrees N and S, where "noise" from short-term weather variations is lower than it is in temperate and polar zones. Hurrell and Trenberth compared simultaneous MSU records to each other, to sea-surface temperatures (SSTs), and to air temperatures simulated by an NCAR climate model using SSTs. They found that most of the difference between MSU and surface trends could be explained by two significant drops in MSU data for 1981 and 1991, years when satellite transitions took place.
Some newspaper and magazine articles now cite only the MSU or only the surface data in reporting on global temperature trends, without noting the other record. Hurrell and Trenberth stress that both data sets are needed to unravel the mysteries of global climate. "The MSU data are excellent for analyzing year-to-year changes, but not necessarily for longer-term trends," says Hurrell.
However, Trenberth notes, a climate model is only as realistic as the theoretical understanding behind it and the complexity allowed in it. Computer resources, while growing rapidly, still restrict the detail and sophistication of current models. NCAR's climate system model, for example, requires weeks of actual time for a single 100- or 200-year climate simulation. "Computing power is one key to future progress," says Trenberth. Another is to improve the representation of common processes such as cloud formation and ocean circulation in order to minimize the number of "flux adjustments"--shifts in energy, water, and momentum exchange that are artificially prescribed in order to make a model more stable. These adjustments run the risk of causing unforeseen and unrealistic side effects in the modeled climate.
In his article, Trenberth describes a strategy for carrying out climate experiments that removes much of the impact of flux adjustments and other potential sources of error. However, this strategy does not eliminate the possibility of complicated feedback effects. Among other sources of difficulty, clouds represent "probably the single greatest uncertainty in climate models," notes Trenberth. "The enormous variety of cloud types, their variability on all space scales . . . and time scales (microseconds to weeks) poses a special challenge."
To help gain confidence in model results, Trenberth advocates the use of such tools as sensitivity tests, to see how much a result varies with small changes in the input conditions or model procedures, and simplified models, which require less computer time, to check approximations and assumptions. He also suggests that the burden of proof for claims that model results are incorrect should be on the critic, not the modeler.
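The sensitivity tests Trenberth advocates amount to running a model with slightly perturbed inputs and measuring how much the result changes. A schematic sketch, in which `toy_model` is a hypothetical stand-in for a real climate model:

```python
# Schematic sketch of a sensitivity test: perturb an input slightly
# and measure the relative change in the model's output.
# toy_model is a hypothetical stand-in, not an actual climate model.

def toy_model(forcing, feedback=0.5):
    """Toy equilibrium response: forcing amplified by a simple feedback."""
    return forcing / (1.0 - feedback)

def sensitivity(model, base, delta):
    """Relative output change per unit relative input change."""
    out0 = model(base)
    out1 = model(base + delta)
    return (out1 - out0) / out0 / (delta / base)

print(round(sensitivity(toy_model, 4.0, 0.04), 3))  # 1.0
```

A sensitivity near 1.0, as here, means the output scales roughly in proportion to the input; a much larger value would flag a result that depends delicately on uncertain inputs and deserves extra scrutiny.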
For policymakers hoping for guidance from computer models, Trenberth emphasizes the value of pooling knowledge and results from a number of different models, as in the Intergovernmental Panel on Climate Change's estimate of a projected global warming of 1.3 to 2.9 degrees C by the year 2100. "Statements such as these, given with appropriate caveats, are likely to be the best that can be made because they factor in the substantial understanding of many processes included in climate models. Such projections cannot offer certainty, but they are far better than declaring ignorance and saying nothing at all."
To receive UCAR and NCAR press releases by e-mail,
contact Milli Butterworth
telephone 303-497-8601 or email@example.com
The National Center for Atmospheric Research and UCAR Office of Programs are operated by UCAR under the sponsorship of the National Science Foundation and other agencies. Opinions, findings, conclusions, or recommendations expressed in this publication do not necessarily reflect the views of any of UCAR's sponsors. UCAR is an Equal Opportunity/Affirmative Action employer.