

Spring 1999

A mountain of data: NCAR/NCEP reanalysis project

Where to get the reanalysis data

SCD offers a selection of reanalysis data on CD at a cost of $10 per disk, with a small service charge for each order. Each CD covers one year of global analysis, with a sampling of variables at 12-hour intervals. The CDs are already very popular for both research and education; more than 3,650 have been shipped to scientists in 38 countries. For more information contact Roy Jenne at 303-497-1215 or jenne@ucar.edu. To order data, contact Chi-Fan Shih at 303-497-1833 or e-mail reanl@ucar.edu.

Amidst stacks of data, Roy Jenne holds one of the reanalysis-data CDs. (Photo by Carlye Calvin.)

Weather stations, radiosondes, balloons, ships, buoys, satellites. When you put together the observations from some or all of these for the last half century, you have a research tool of enormous potential. But in their original state, these observations are riddled with pitfalls. Some are deteriorating. Some of what should be comparable data are not. Some are flawed, but it may be hard to tell which. And there are many gaps.

To make these data more useful and reliable, in 1991 NCAR and the National Centers for Environmental Prediction (NCEP) undertook a project to collect, organize, quality-control, and reanalyze all available data from 1948 to 1998. Completed last July, the project has produced a new global data set: analyses every six hours on a horizontal grid of about 200 kilometers (125 miles) with 28 vertical levels of the atmosphere, a total of 2.5 terabytes of data.
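For a feel for where 2.5 terabytes comes from, here is a rough back-of-envelope check in Python. The 192-by-94 horizontal grid (the Gaussian grid associated with T62 spectral resolution, roughly 210 kilometers) is our assumption for illustration; the article itself gives only the approximate spacing, the 28 levels, and the six-hour interval.

```python
# Rough check of the 2.5-terabyte figure. The horizontal grid
# dimensions are an assumption for illustration (T62 Gaussian grid);
# levels, interval, and years come from the article.

nlon, nlat = 192, 94       # assumed horizontal grid (~200 km spacing)
nlev = 28                  # vertical levels
steps_per_day = 4          # one analysis every six hours
years = 51                 # 1948 through 1998
bytes_per_value = 4        # 32-bit floating point

values = nlon * nlat * nlev * int(years * 365.25 * steps_per_day)
print(f"~{values * bytes_per_value / 1e9:.0f} GB per 3D variable")
# -> ~151 GB per three-dimensional variable
```

At roughly 150 gigabytes per three-dimensional field, fifteen or twenty such fields, plus surface fields and short forecasts, readily account for the 2.5-terabyte total.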

"It really is a lot," says Roy Jenne, head of the Data Support Section (DSS) of NCAR's Scientific Computing Division (SCD). Obviously not a man given to overstatement, Jenne led NCAR's effort to assemble the data for the reanalysis, an endeavor that called for the skills of a librarian, detective, and negotiator as well as scientific expertise.

Jenne emphasizes that the job was only possible with help from around the world, along with a very serious effort from his entire staff. At NCAR, "We've been doing a lot of data collection since 1965, but in Asheville [North Carolina, home of the National Climatic Data Center and also a U.S. Air Force data archive], there was lots of keyboarding in those early years," he says. Back then, data were kept on computer-readable cards. "At one time, Asheville had something like 600 million punch cards. They were stacked up in the hallways, on chairs . . . you couldn't sit down."

The data already archived in major centers were only part of the picture. Jenne's group also collected and processed data sets from various other sources, including tropical aircraft reports from the University of Hawaii, upper-air data from Brazil and Argentina, and much more. They even had to key-enter a lot of station location information. The data include early reports from Canadian weather stations and Cold War-era weather reports from Russia and China gleaned from radio broadcasts.

"If you let these old data sets go," says Jenne, "there's going to be an attrition rate." He mentioned the case of a very poor country in which the meteorological service's old weather charts were finding their way to the local market, where they served to wrap fish. Once data like these are gone, they're gone forever. "But we've at least got enough observations now to give the data assimilation models a fighting chance to make good analyses of the whole atmosphere."

Coming from such diverse sources, the collected data have flaws of various kinds. "When you're working with millions and billions of numbers, there will always be some random errors here and there," Jenne points out. "What we had to do is detect the systematic errors against this background." Most of these came from human mistakes rather than instrument failure. For example, one rawinsonde site had exactly the same measurements every time, a problem that Dennis Joseph of SCD thinks stemmed from a programming error during data processing. For that site, there was no way to retrieve the correct observations. "But usually with a little detective work, you can unscramble the problem," Jenne says. "We haven't had to lose much data."
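The stuck-station case lends itself to an automated screen. Here is a minimal sketch in Python of how such a check might look; the report format and the threshold are hypothetical, not DSS's actual procedure.

```python
# Minimal sketch of a "stuck station" screen: flag any station that
# files many reports but never varies. The report format (station id,
# tuple of measured values) and the 100-report threshold are
# hypothetical, not the DSS's actual quality-control procedure.

from collections import defaultdict

def stuck_stations(reports, min_reports=100):
    """reports: iterable of (station_id, values_tuple) pairs."""
    distinct = defaultdict(set)
    counts = defaultdict(int)
    for station, values in reports:
        distinct[station].add(values)
        counts[station] += 1
    # Random errors scatter; a processing bug repeats values exactly.
    return [s for s, vals in distinct.items()
            if counts[s] >= min_reports and len(vals) == 1]
```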

Even with all available data collected, gaps remain for some regions and periods. In the reanalysis, these are filled with values produced by the forecast model using advanced data-assimilation techniques. As Jenne has written, "We only need enough observations to define the atmosphere, given the ability of models to fill gaps."
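In its simplest one-variable form, the idea works like this (a sketch, not NCEP's actual multivariate scheme): where an observation exists, the analysis blends it with the model's short-range forecast, weighting each by its expected error; where none exists, the model value stands alone.

```python
# One-variable sketch of the analysis update behind gap filling.
# 'background' is the model's short forecast; var_bg and var_obs are
# assumed error variances. This is the textbook scalar update, not
# the scheme NCEP actually ran.

def analyze(background, obs, var_bg, var_obs):
    if obs is None:                        # a data gap: the model fills it
        return background
    gain = var_bg / (var_bg + var_obs)     # weight given to the observation
    return background + gain * (obs - background)

# Model forecast 284.0 K; a radiosonde reads 282.5 K; the sonde is
# trusted more, so the analysis lands closer to it.
print(analyze(284.0, 282.5, var_bg=1.0, var_obs=0.5))   # -> 283.0
```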

The reanalysis of these neatened and standardized data took place at NCEP with a state-of-the-art analysis/forecast system; Eugenia Kalnay was lead principal investigator. It was considered the largest and most challenging project ever attempted there, just as it had been the largest project for Jenne's group at NCAR. "Everything took much longer than their most pessimistic estimates," Jenne says. "Also, the project was able to add more sources of data than was first deemed possible."

Model details had to be worked out before production began, and then adapted to changes in the world's observing systems. For example, the model initially estimated summer surface temperatures in the U.S. Great Plains about 5 degrees C too high because it incorrectly simulated the browning of winter wheat and the subsequent change in the earth's reflectivity; there was a related problem in the soil. Both were fixed before production started. Changes like these, from this and other reanalyses, lead to permanent improvements in the forecast models. Jenne says, "I think it's fair to say that reanalysis has been associated with one of the most rapid periods of data-assimilation and forecast model development we've ever seen."
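The reflectivity effect shows up even in a toy calculation. The sketch below puts a bare surface in pure radiative equilibrium, ignoring evaporation, turbulent exchange, and downwelling longwave radiation; the albedo and insolation numbers are illustrative guesses, not the model's actual parameters. Even so, a few hundredths of albedo is worth several degrees.

```python
# Toy radiative-equilibrium calculation showing why a surface-albedo
# error matters. Ignores evaporation, turbulent fluxes, and atmospheric
# longwave; albedo and insolation values are illustrative assumptions.

SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
S = 400.0              # assumed daytime-mean summer insolation, W m^-2

def equilibrium_temp(albedo):
    return ((1.0 - albedo) * S / SIGMA) ** 0.25

t_green = equilibrium_temp(0.15)   # wheat mistakenly kept green and dark
t_brown = equilibrium_temp(0.20)   # browned wheat reflects more sunlight
print(f"{t_green - t_brown:.1f} K warmer")   # ~4 K, the order of the bias
```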

The reanalyzed data support research in a variety of fields. They are archived at NCAR and made available on CDs, tapes, and (in small amounts) over the Internet.

The reanalyzed data are used heavily on the NCAR computers. Although they are only about one-fifth of the total DSS archive in size, in 1998 they accounted for about half of the archive's use. "It's both the new thing on the block and the best thing on the block," remarks Jenne.

"This first long reanalysis is a great start, but more projects are coming that use still better models," he explains. The European Center for Medium-Range Weather Forecasts has completed a 15-year reanalysis, and they will start a long one (about 1957-2000) in late 1999. NCEP will do another long one a little later. "And we are still finding some more observations."



Edited by Carol Rasmussen, carolr@ucar.edu
Prepared for the Web by Jacque Marshall