Minute by Minute
Close Up With Weather's Worst
Whether stationed on a lonely Midwest road or a deserted Atlantic beach, NCAR scientists and engineers train high-tech sensors on tornadoes, hurricanes, and severe thunderstorms in some of meteorology’s most intrepid research. Back at the lab, their data reveal new threads of similarity among many types of wild weather. Software experts at NCAR and elsewhere use these leads, plus innovative statistical techniques, to refine new models that help give forecasters and the public better guidance on weather at its worst.
A pioneer predictor of twisters
Long before “tornado watch” became a household phrase, John Finley was issuing daily tornado outlooks.
While serving in the U.S. Signal Service on the East Coast, Finley indulged his interest in tornadoes by compiling an 88-year climatology. He also recruited hundreds of tornado observers across the growing nation. Each spring during the mid-1880s, Finley issued daily outlooks for tornado likelihood in 18 districts from the Great Plains eastward. Despite some evidence of skill, Finley’s technique was soon abandoned as the nation’s weather service shifted to civilian hands. Almost 70 years later, the U.S. Weather Bureau began issuing tornado watches once more, this time from Kansas City—the centralized point Finley himself had recommended.
AT THE UNIVERSITIES
A better starting point in the forecasting race
As a postdoctoral researcher at NCAR during 1997–99, Gregory Hakim (University of Washington) forged collaborations that continue today.
Hakim’s keenest goal now is to improve data assimilation, the techniques by which diverse observations are fed into computer models. “This is a problem that affects both weather and climate—the whole shebang. Being able to synthesize models and observations is crucial for everybody.” His tool of choice is the ensemble Kalman filter (EnKF), a mathematical technique that allows errors in the starting-point analyses of models to be quantified in detail. In standard analyses, “there’s no measure of error or uncertainty,” says Hakim. With the EnKF, as many as 100 or more analyses are produced for the same time and place, and the group average and variance can be calculated, providing a new window on why a given forecast might go right or wrong. “Having this probabilistic information actually gives new ways of looking at the dynamics of weather systems. That, to me, has been one of the huge, unexpected rewards.” With NSF support, Hakim is working with Christopher Snyder (NCAR), Thomas Hamill (NOAA), and Fuqing Zhang (Texas A&M University) on the EnKF studies as part of NCAR’s Data Assimilation Initiative.
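The kernel of the EnKF idea—an ensemble of analyses whose mean and variance quantify uncertainty, each member nudged toward an observation—can be sketched in a few lines. This is a toy scalar example, not Hakim’s actual system; the numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def enkf_update(ensemble, obs, obs_var):
    """Perturbed-observation ensemble Kalman filter update for a
    scalar state that is observed directly (H = 1)."""
    x = np.asarray(ensemble, dtype=float)
    # The ensemble spread supplies the background-error variance that
    # a single deterministic analysis lacks.
    pb = x.var(ddof=1)                  # background (prior) variance
    k = pb / (pb + obs_var)             # Kalman gain
    # Each member assimilates a perturbed copy of the observation so
    # the analysis ensemble retains a statistically consistent spread.
    perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), size=x.size)
    return x + k * (perturbed_obs - x)

# 100 first-guess analyses of, say, surface pressure (hPa).
prior = rng.normal(1002.0, 2.0, size=100)
posterior = enkf_update(prior, obs=1000.0, obs_var=1.0)

print(prior.mean(), prior.var(ddof=1))
print(posterior.mean(), posterior.var(ddof=1))
```

After the update, the ensemble mean is pulled toward the observation and the variance shrinks—the “measure of error” that standard single analyses cannot provide.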
A new eye on water vapor's invisible flow
Although thunderstorms came under the scrutiny of weather radar as early as the 1950s, water vapor in the atmosphere eluded frequent monitoring. Those days are numbered, though, thanks in part to a UOP-led team that includes John Braun.
In the mid-1990s, Braun and colleagues began extracting data on water vapor from the delays in Global Positioning System (GPS) signals induced by moisture in the atmosphere. A related technique will be used on a space-based monitoring system called COSMIC. The team has made big strides on the ground as well. More than 100 GPS receivers now track water vapor every half hour, many of them at university-operated sites through an NSF-funded initiative called SuomiNet. With these data, scientists can estimate the total amount of precipitable water above each station. An improvement of that technique measures water vapor along individual signal paths, so that signals arriving from satellites low on the horizon can be used to sample water vapor along low-lying horizontal swaths stretching tens of miles. Braun and colleagues are developing methods to synthesize the SuomiNet data with other sources and feed them into computer forecasting models. “Long-term studies are starting to show a significant positive value,” says Braun. “We can really start looking at how the water vapor changes over time. That allows us to validate models, to really understand a part of the hydrologic cycle, and make improvements in water-cycle science.”
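The step from GPS signal delay to precipitable water follows a standard conversion: the zenith wet delay is multiplied by a factor that depends on the column’s mean temperature. The sketch below uses the commonly published refractivity constants; it is an illustration of the general method, not the team’s processing code, and the mean-temperature value is an assumed typical figure.

```python
# Convert a GPS zenith wet delay (ZWD) to precipitable water (PW).
RHO_W = 1000.0     # density of liquid water, kg/m^3
R_V = 461.5        # specific gas constant for water vapor, J/(kg K)
K2_PRIME = 0.221   # refractivity constant, K/Pa  (22.1 K/hPa)
K3 = 3.739e3       # refractivity constant, K^2/Pa (3.739e5 K^2/hPa)

def precipitable_water(zwd_m, tm_kelvin=270.0):
    """PW (m) from zenith wet delay (m); tm_kelvin is the
    water-vapor-weighted mean temperature of the column."""
    # Dimensionless conversion factor, typically about 0.15.
    factor = 1.0e6 / (RHO_W * R_V * (K3 / tm_kelvin + K2_PRIME))
    return factor * zwd_m

# A 10 cm wet delay corresponds to roughly 15 mm of precipitable water.
print(precipitable_water(0.10) * 1000.0)
```

The same arithmetic, run every half hour at each SuomiNet station, yields the time series of total column moisture that Braun describes.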
Clearing the way to top-notch radar data
When the National Weather Service began upgrading its radar network nearly two decades ago, university scientists watched with some envy. Their access to the new data was often limited to a subset of current reports from a single nearby radar, plus archived data that could take weeks to receive. Now, university scientists and private firms can obtain up-to-the-minute, high-quality data from more than 120 radars, the result of a partnership championed by UOP’s Linda Miller.
A former administrator at NOAA and Vaisala, Miller joined UOP’s Unidata Program Center in 1990. “I really enjoy working with people in the academic community, government, and the private sector on various data issues and community needs,” says Miller. She and Unidata colleagues teamed with Kelvin Droegemeier (University of Oklahoma) and other key leaders to address institutional and technical hurdles that blocked speedy access to the new radar data. With the team’s help, NOAA secured bandwidth on the Internet2 network and chose four sites to funnel the vast amounts of data to universities (at no cost) and private industry (on a cost-recovery basis). The enhanced data include the higher-elevation scans needed to fully profile intense thunderstorms, especially those close to a radar site. “It’s really exciting to see our technology reaching beyond our traditional user community,” says Miller. “We worked through the issues in a collaborative environment with the stakeholders, and now we’ve got something that works.”
Even for storm-seasoned residents of the U.S. heartland and Atlantic coast, 2004 yielded a bumper crop of atmospheric angst. A total of nine named tropical systems pummeled the coastline from Texas to North Carolina, including an unprecedented string of billion-dollar hurricanes—four in all—that raked Florida. Nationwide, more than 1,500 tornadoes were reported, a preliminary total that far exceeds the U.S. record for any prior year.
As NOAA’s National Weather Service leads the way toward U.S. weather safety, NCAR plays a key role behind the scenes. With NOAA and university colleagues, NCAR develops and deploys some of the newest and most innovative tools for spying on bad weather from the ground and the sky. Those data feed into conceptual models that, in turn, help forecasters better understand growing threats, including the multiday unfolding of heavy-rain episodes. Vital new software developed through interagency teamwork entered the toolbox of forecasters in 2004 (see Test-driving tomorrow's forecast model). And one of NCAR’s newest weather sensors may help the nation protect itself from a different kind of atmospheric threat, one seldom pondered before September 11, 2001.
A super-sized storm chase
Christopher Davis calls it “storm chasing gone mesoscale.” In the spring and early summer of 2003, this NCAR scientist co-managed the center’s biggest-ever foray into the springtime severe weather that regularly plagues the Midwest.
The massive study area for the Bow Echo and MCV Experiment (BAMEX) extended north to Minnesota, south to Louisiana, west to Kansas, and east to Ohio. This enabled participants to track not only a single day’s storms—including the bow-shaped echoes associated with destructive straight-line winds—but also the mesoscale convective vortices (MCVs) that sometimes emerge from the remains of the day’s tempests. Ranging in width from 80 to 300 kilometers (50–200 miles), these low-pressure centers can focus thunderstorms the following day. If they move over water, MCVs can even serve as the seed for a tropical cyclone.
Bow echoes and MCVs are poorly sampled by routine, twice-daily weather-balloon launches separated by hundreds of kilometers. “They’re just not adequate,” says Davis. For six stormy weeks, BAMEX filled in the gaps, using mobile observing systems from NCAR, Doppler radars aboard P-3 aircraft from NOAA and the Naval Research Laboratory, parachute-borne sensors dropped from a Learjet, and other tools.
Early results from BAMEX point to a spectrum of vortex behavior whose vigor and extent scientists hadn’t expected. For instance, analysis led by Nolan Atkins (Lyndon State College) and Robert “Jeff” Trapp (Purdue University) found that the greatest damage was typically observed not in the most extensive bow echoes, but in smaller ones spanning 100 km (60 mi) or less. Even smaller spin-ups within these small bows appear to focus most of the destructive winds and pose the highest risk for tornadoes. BAMEX also uncovered some red flags that may prove useful in forecasting high wind, such as bands seen on radar that converge and feed into a storm at right angles.
At least two of the MCVs in BAMEX extended to the lowest kilometer of the atmosphere, a depth that surprised researchers. The apparent kinship between these systems and incipient tropical cyclones intrigues Davis. “There’s remarkable continuity in the spectrum of cyclones as you go from midlatitudes to the tropics, especially in the early stages of development,” he says.
Picking the tornado producers
The most spectacular duo of storms in BAMEX occurred on June 22 in southern Nebraska. One cell, near Aurora, dropped the largest U.S. hailstone ever measured, at 7 inches (nearly 18 centimeters) wide. Less publicized was the next cell south, near Superior, which bore the largest and strongest storm-scale cyclone ever measured. It was “the mother of all mesocyclones,” according to Roger Wakimoto, who will move from the University of California, Los Angeles, to head NCAR’s Earth Observing Laboratory in mid-2005.
Wakimoto captured the record-setting vortex using the airborne Electra Doppler Radar, built by NCAR and CRPE, France’s center for Earth and planetary physics research, to measure high winds in and near storms. During the Superior storm, the radar detected wind shear aloft that exceeded 400 km/hr (about 250 mph) across just a few miles. This shear was so strong that ground-based radar couldn’t accurately interpret it.
Could fractal geometry help solve this dilemma? NCAR postdoctoral researcher Huaqing Cai is exploring the notion. Fractal theory emphasizes how scales of measurement—the length of a ruler, as it were—can obscure some aspects of natural systems. While a mile-long ruler would give one measurement of a jagged coastline, for example, it would miss smaller curves.
Cai applied this technique to radar data on five mesocyclones (including the one near Superior), three of which spawned tornadoes. He analyzed the data with a variety of horizontal grid scales, and for each storm, he examined how the maximum vorticity (circulation around an axis) changed with different scales.
Cai found that vorticity and scale are related by a power law: the finer the grid, the bigger the ramp-up in vorticity as one approaches the mesocyclone. Where tightening the grid produced the biggest ramp-ups, a tornado was most likely to be present. If this relationship is confirmed by more data, it could eventually lead to on-the-fly radar analysis that distinguishes the most tornado-prone storms. “This isn’t a tornadogenesis theory,” Cai adds. “It doesn’t tell you why a mesocyclone produces a tornado. It just tells you which one might.”
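The scale dependence Cai exploits can be demonstrated with an idealized vortex: sample its winds on progressively coarser grids, record the maximum vorticity at each spacing, and fit a power law on a log-log plot. This sketch uses an assumed Rankine vortex and illustrative numbers, not Cai’s radar data or method details.

```python
import numpy as np

def max_vorticity(grid_spacing, v_max=50.0, core_radius=500.0, extent=8000.0):
    """Maximum vertical vorticity of an idealized Rankine vortex when its
    winds are sampled on a grid with the given spacing (units: m, m/s)."""
    n = int(extent / grid_spacing)
    x = (np.arange(n) - n / 2) * grid_spacing
    xx, yy = np.meshgrid(x, x)
    r = np.hypot(xx, yy) + 1e-9
    v_tan = np.where(r < core_radius,
                     v_max * r / core_radius,      # solid-body core
                     v_max * core_radius / r)      # potential-flow exterior
    u, v = -v_tan * yy / r, v_tan * xx / r         # Cartesian wind components
    # Centered-difference vertical vorticity: dv/dx - du/dy
    zeta = (np.gradient(v, grid_spacing, axis=1)
            - np.gradient(u, grid_spacing, axis=0))
    return zeta.max()

spacings = np.array([250.0, 500.0, 1000.0, 2000.0])
zeta_max = np.array([max_vorticity(d) for d in spacings])

# On a log-log plot the points fall near a line: a power law whose
# negative slope measures how fast vorticity ramps up as the grid tightens.
slope = np.polyfit(np.log(spacings), np.log(zeta_max), 1)[0]
print(slope)   # negative: coarser grids smear out the vortex
```

In this spirit, a steeper ramp-up as the grid tightens flags a more intense, compact circulation—the signature Cai associates with tornado-prone mesocyclones.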
A single Doppler radar can only detect winds blowing toward or away from itself. That gives a useful but incomplete picture of the complex wind fields that swirl around intense vortices. An NCAR-designed technique for extracting riches from single-Doppler data is yielding insight into the processes at the heart of both tornadoes and hurricanes.
NCAR’s Wen-Chau Lee had hurricanes in mind when he developed his ground-based single-Doppler wind retrieval technique (known by its acronym, GBVTD), as the technique assumes the presence of a good-sized vortex. But GBVTD soon found its way into tornado research, spurred by the dramatic data being collected across the Great Plains by mobile Doppler radars.
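The core idea behind such a retrieval—fitting harmonics of the Doppler velocity around rings centered on the vortex—can be illustrated in simplified form. The sketch below assumes the radar is far from the vortex so the geometry reduces to a simple cosine/sine fit; the full GBVTD formulation handles the radar’s finite distance, and the wind values here are invented for illustration.

```python
import numpy as np

def retrieve_ring_winds(psi, v_doppler):
    """Least-squares fit of the lowest harmonics around one ring.
    psi is azimuth measured from the radar-to-vortex direction;
    the coefficients are the axisymmetric radial and tangential winds."""
    basis = np.column_stack([np.cos(psi), np.sin(psi)])
    (v_r, v_t), *_ = np.linalg.lstsq(basis, v_doppler, rcond=None)
    return v_r, v_t

# Synthetic ring: 60 m/s tangential wind, 5 m/s inflow, noisy samples.
rng = np.random.default_rng(0)
psi = np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False)
truth_vr, truth_vt = -5.0, 60.0
v_d = (truth_vr * np.cos(psi) + truth_vt * np.sin(psi)
       + rng.normal(0.0, 1.0, psi.size))

v_r, v_t = retrieve_ring_winds(psi, v_d)
print(v_r, v_t)   # close to -5 and 60
```

Repeating the fit ring by ring and level by level is what turns a single radar’s one-component measurements into a profile of the vortex’s swirling and in-and-out flow.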
One of the pioneers of storm-chasing radar is Howard Bluestein (University of Oklahoma). He and colleagues recently applied GBVTD to high-resolution data gathered from Kansas and Nebraska twisters by a mobile W-band radar built at the University of Massachusetts Amherst. The radar’s fine detail shows an array of intriguing features, including concentric rings in the Kansas tornado akin to the double eyewalls found in some hurricanes.
Through GBVTD, Bluestein discovered even more: a two-lobe structure, with pockets of especially high wind at opposite ends of the two tornadoes. This pattern, long noted in some lab-generated tornadoes, could shed light on internal dynamics of the vortex. What’s more, the lobes stayed in position around the twister, unlike the multiple vortices that spin around some tornadoes—and in contrast to theory and computer-based simulations. “This was surprising,” notes Bluestein, who believes the lobes could be a product of the tornado being deformed by the boundary along which it developed.
Another veteran of radar road trips, Joshua Wurman (Center for Severe Weather Research), is teaming with Lee to derive new insight from a decade of data. Wurman is working closely with NCAR to refine his Doppler on Wheels units, which had a busy 2004. The DOWs profiled 36 tornadoes and tracked the landfalls of Hurricanes Frances and Ivan, gathering data every 10 seconds on Frances. “It was an amazing year,” says Wurman.
Lee and Wurman have been working on the first three-dimensional tornado analysis produced using GBVTD. Drawing on DOW data from a large and deadly 1999 tornado that struck Mulhall, Oklahoma, they’ve found updrafts blowing at near-hurricane force, surface air pressure comparable to that of a Category 4 hurricane, and a central downdraft not unlike the subsidence in the eye of a hurricane. Lee notes “striking similarity” between characteristics derived from the Mulhall data and those observed for years in laboratory- and computer-generated twisters. “We hope to be able to look at more cases and produce a climatology for a spectrum of tornadoes,” he says.
Lee’s technique is being considered by the National Hurricane Center as a way to extract quick estimates of a hurricane’s peak winds and central pressure as it approaches land. For instance, Lee found that GBVTD could have drawn on coastal radar data in August 2004 to provide valuable early notice of Hurricane Charley’s rapid intensification just before landfall.
A new eye on national security
If tornadoes and hurricanes still terrorize much of the United States, the nation as a whole began pondering other types of risk in earnest after September 11. There’s a meteorological element to one potential form of attack: the airborne dispersion of biological hazards, such as anthrax or smallpox.
A new type of lidar (the laser-based counterpart of radar) developed at NCAR has proven to be an uncommonly keen observer of trouble in the air. It detects not only the microscale air motions that might carry biological agents, but also many of the tiny particles that could represent the hazards themselves.
The Raman-shifted Eye-safe Aerosol Lidar (REAL) emerged from several years of collaboration among NCAR scientist Shane Mayor, optical and systems engineer Scott Spuler, and associate scientist Bruce Morley. They combined traditional and newer techniques to build a lidar that would operate in an eye-safe range of wavelengths (making it easy to deploy in urban environments) yet would pack plenty of observing power.
The high-end optics allow REAL to track aerosols every tenth of a second at range intervals as fine as 3 meters (10 feet). The resulting image—which looks much like a radar display, only in miniature—captures phenomena rarely observed before by researchers. “A radar sees reflections from big targets like raindrops and bugs,” says Spuler. “We’re seeing reflections from particles about 10,000 times smaller.”
The lidar’s first big test came at the Pentagon in the spring of 2004, where a breakthrough blend of high-tech instruments and weather forecasting models took shape. Coordinated by scientists at NCAR and sponsored by the Defense Advanced Research Projects Agency, the tests scanned for potential airborne hazards near the Pentagon and predicted their motion and impact on the building.
REAL, along with other observing tools and a set of nested computer models, charted complex air circulations around the Pentagon at resolutions as fine as 2 meters (7 feet). At the Pentagon, REAL’s display lit up with signals from the myriad of aerosol plumes common in the center of a busy, densely populated urban area. At home next to NCAR’s Foothills Laboratory, REAL watches the daily ebb and flow of pollutants and routinely captures trails of exhaust from freight trains passing near the site.
Late in 2004, a test in Utah showed REAL’s ability to detect clouds of particles standing in for more dangerous aerosols, such as anthrax spores. While the lidar can’t distinguish an innocuous aerosol from a lethal one, says Mayor, “REAL could be used as a first layer of remote sensing that would trigger additional sensors when needed.”