The WRF team includes (left to right) Shu-Hua Chen, principal implementer of model physics; overall coordinator Joseph Klemp; William Skamarock, head of the working group for dynamic model numerics; and John Michalakes, head of the working group for software architecture, standards, and implementation. Not shown are Jimy Dudhia, head of the working group for workshops, model distribution, and community support; and Dave Gill, implementer for Web pages and real-data testing. (Photo by Carlye Calvin.)
WRF development has been a collaboration among scientists at NCAR's Mesoscale and Microscale Meteorology (MMM) Division, NOAA's National Centers for Environmental Prediction (NCEP) and Forecast Systems Laboratory (FSL), the University of Oklahoma's Center for Analysis and Prediction of Storms, and the Air Force Weather Agency. WRF will offer resolution that's about an order of magnitude better than existing operational mesoscale models. "When we look down the road to greater computer power, we want to have horizontal grids of a couple kilometers so we can resolve small-scale weather features as they're evolving," says Joseph Klemp, who is leading the development effort at MMM.
The bare-bones version has a basic set of physics packages and standard real-data initialization for users to work with. Getting this version ready for release has been a tradeoff, Klemp says. "We'd like interested users to contribute to the development process, but we don't want to frustrate them. They have to understand it's not the final version." For example, the physics packages that were ported to WRF had to be recoded to interface with WRF's other layers, so "there may be interaction problems."
WRF has a three-layer structure. John Michalakes, a visiting computer scientist from Argonne National Laboratory who is doing WRF development in MMM, explains: A driver layer deals with computer architecture (and also such issues as managing nested grids) so that the user can run the model on distributed-memory, shared-memory, vector, or cluster machines without having to modify it. Theoretically, WRF's driver layer could be used for other models, including general circulation models. However, Michalakes points out that it would have to be modified to deal with, for example, spectral transforms and coupling among component models, since these features don't yet occur in WRF.
Scientists who focus on the algorithms for physics and dynamics can work solely in the other main layer. Joining this "model" layer to the driver layer is a "mediation" layer, which Michalakes describes as "a glue layer that has to know a little bit about both other layers so they can interact."
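The division of labor among the three layers can be sketched in a few lines of code. This is a hypothetical illustration of the layering idea only, not actual WRF code; every name here (DriverLayer, MediationLayer, model_step, the tile decomposition) is invented for the sketch.

```python
# Hypothetical sketch of a three-layer model architecture in the spirit of
# WRF's driver / mediation / model design. Names and logic are illustrative
# assumptions, not WRF identifiers.

def model_step(tile, dt):
    """Model layer: pure science code. Advances one grid tile by one time
    step; knows nothing about parallelism or memory layout."""
    return [u + dt * 1.0 for u in tile]  # stand-in for dynamics/physics


class MediationLayer:
    """Mediation ('glue') layer: knows just enough about both sides to hand
    the driver's tiles to the model-layer routine."""
    def __init__(self, step_fn):
        self.step_fn = step_fn

    def advance_tile(self, tile, dt):
        return self.step_fn(tile, dt)


class DriverLayer:
    """Driver layer: owns the domain decomposition and the run loop. A
    different driver (MPI, threads, vector) could be swapped in without
    touching the science code above."""
    def __init__(self, mediation, state, n_tiles=2):
        self.mediation = mediation
        self.n_tiles = n_tiles
        # decompose the domain into tiles (round-robin for simplicity)
        self.tiles = [state[i::n_tiles] for i in range(n_tiles)]

    def run(self, steps, dt):
        for _ in range(steps):
            # in a parallel build, this loop would be distributed
            self.tiles = [self.mediation.advance_tile(t, dt)
                          for t in self.tiles]
        # reassemble the decomposed domain into one state vector
        out = [0.0] * sum(len(t) for t in self.tiles)
        for i in range(self.n_tiles):
            out[i::self.n_tiles] = self.tiles[i]
        return out


driver = DriverLayer(MediationLayer(model_step), state=[0.0] * 6)
result = driver.run(steps=3, dt=0.5)  # each point advances by 3 * 0.5
```

The point of the structure is that `model_step` never sees tiles, processors, or loops over the domain, while `DriverLayer` never sees the physics; only the thin mediation layer touches both.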
This structure gives WRF a flexibility that will be needed to serve both researchers and forecasters. The idea of a product that could meet the needs of these disparate groups grew from a more modest collaborative effort in MMM. "Within the division, we have typically had half a dozen [separate] models of significant complexity," says Klemp. "We had cloud-scale models for basic research in idealized applications, and the MM5 [Mesoscale Model 5] was good for real data but not for idealized simulations. In research, you often start with a very simple, idealized problem and work your way up to the full-blown problem. We could see the value of doing all that on a single model."
As cloud and mesoscale modelers in MMM began talking about pooling their resources, they recognized that their product might also reduce some of the delays that typically take place between the birth of an innovation in the research community and its adoption by operational meteorologists. "There was rapid recognition among all of the participating organizations that there was value in developing a common modeling system," Klemp says. "With WRF, at least there's a potential for streamlining a lot of technology transfer."
The development effort for WRF is impressive in several respects. For one thing, it has gotten started without a lot of WRF-specific funding. "We've been trying to forge ahead on the resources available," says Klemp. That has certainly had an impact on the pace of work: "A few critical people are moving things forward, so when someone takes a two-week vacation, it throws our schedule back two weeks."
For NCAR, it may be more significant that the development team, which includes software engineers and scientists, works together very well (see p. X). Klemp says, "Our success is in developing a real team attitude. [The engineers] don't just tell us what to do and leave us to do it or not; there's a lot of going back and forth until we agree on the best way to do it." Michalakes concurs: "There's a joint appreciation, respect, and feeling of ownership by the respective members of the team."
More information on WRF can be found on the Web.