UCAR Staff Notes

November 2005

Planning a national supercomputing center for the geosciences


NCAR is laying the groundwork to create a computational equivalent of the Hubble telescope

With the Mesa Lab almost at its limit for providing electricity and floor space for increasingly powerful supercomputers, NCAR has been scouting alternative sites for new generations of machines. But instead of focusing on a supercomputing facility purely for atmospheric scientists, the organization has started to work with NSF and potential partners in industry and academia to develop a national supercomputing center that would serve the entire geosciences community.

An artist's rendition of the proposed new supercomputing center. (Courtesy The Crosby Group.)

Scientists envision an expandable facility that would eventually house supercomputers with peak speeds in excess of one petaflop (10^15 operations per second). In contrast, today's supercomputers have peak speeds that are measured in teraflops, or trillions of operations per second. The new facility is being designed with sufficient flexibility to accommodate many generations of increasingly powerful machines to support cutting-edge geoscience research.
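To put those units in perspective, here is a quick illustrative calculation (in Python; the numbers are simply the definitions of the prefixes, not machine specifications from the article):

    # Illustrative arithmetic only: relating the units mentioned above.
    teraflop = 1e12   # 1 teraflop = one trillion operations per second
    petaflop = 1e15   # 1 petaflop = one quadrillion operations per second

    # A one-petaflop machine is a thousand times faster than a one-teraflop machine.
    print(petaflop / teraflop)  # -> 1000.0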

"This is extremely important, probably the number-one UCAR/NCAR priority," explained UCAR president Rick Anthes at the October meeting of the UCAR Management Committee (UMC). "It is a science-driven project and is exactly what a national center should
be doing."

Supercomputing space is a major concern not just at NCAR, but also at a number of other scientific institutions. With NSF interested in establishing a single supercomputing center to serve a broad segment of the research community, NCAR has emerged as an ideal organization to manage the center because of its long and successful track record of serving the Earth sciences community. A consortium of partners, rather than just NCAR and UCAR, would likely govern the center.

The center is a critical component of NSF's strategic plan for strengthening cyberinfrastructure for the sciences. "NSF's intent is to gain leadership in high-performance computing for the geosciences," NCAR director Tim Killeen said at the October UMC meeting.

"It is a science-driven
project and is exactly what a national center should be doing."

—Rick Anthes

Current plans call for the new facility to have 20,000 square feet of raised-floor computer space, which would eventually be increased to 60,000 square feet. It would be built on a 10- to 15-acre site and be powered by up to 13 megawatts of electricity.
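For a rough sense of scale, the power and floor-space figures above can be combined into a back-of-envelope power density (an illustrative calculation only; the article does not state this number):

    # Back-of-envelope power density implied by the figures above (illustrative only).
    power_watts = 13e6            # up to 13 megawatts of electricity
    initial_floor_sqft = 20000    # initial raised-floor computer space
    full_floor_sqft = 60000       # eventual raised-floor computer space

    print(power_watts / initial_floor_sqft)  # -> 650 watts per square foot initially
    print(power_watts / full_floor_sqft)     # -> ~217 watts per square foot at full expansion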

Such a center would lead to greatly expanded scientific computing capability. For example, atmospheric researchers would be able to model regional climate on such a fine scale that they could capture individual mountain ranges and ocean currents. They could also fully integrate climate and weather models and simulate detailed cloud, ocean, land, and ice processes.
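One way to see why finer-scale modeling demands petaflop-class machines is a commonly cited rule of thumb (a rough sketch, not a figure from the article): halving a model's horizontal grid spacing multiplies the cost by roughly a factor of eight, because both horizontal dimensions are refined and the time step must shrink proportionally.

    # Rough rule-of-thumb sketch (an illustrative assumption, not from the article):
    # refining horizontal grid spacing by a factor r costs roughly r**3 more,
    # from two horizontal dimensions plus a proportionally shorter time step.
    def relative_cost(coarse_km: float, fine_km: float) -> float:
        refinement = coarse_km / fine_km
        return refinement ** 3

    # Example: moving from ~100 km spacing to ~10 km spacing, fine enough to begin
    # resolving individual mountain ranges, costs on the order of 1,000 times more,
    # roughly the jump from teraflop-class to petaflop-class machines.
    print(relative_cost(100.0, 10.0))  # -> 1000.0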

The center's supercomputers would enable geoscientists to better predict seismic activity and glean insights into Earth's inner core. In addition, researchers would use the center to assimilate data from increasingly sophisticated instruments into computer models to learn more about Earth's weather and climate, as well as biogeochemical and biogeophysical processes and space physics.

A presentation assembled by NCAR and UCAR staffers working on the project describes the facility as "a computational equivalent of the Hubble telescope for geoscience simulation."

The timeline

Lawrence Buja. (Photo by Carlye Calvin, UCAR)

NCAR's goal is to have the new center operational by 2009. However, it must first tackle a number of issues:

• Where should the facility be located? NCAR is looking into several potential sites in Boulder County as well as elsewhere along the Front Range. The site could be developed on an industrial or university campus, or on vacant land near Boulder.

• How should the facility be governed? NCAR has been discussing the operation of the facility with potential partners in government, industry, and academia. No decisions have been made, but it's possible that a broad consortium of organizations may manage the center.

• How will it be financed? The facility is expected to cost $75 million to build and $15 million a year to operate, about twice what it costs to run NCAR's current suite of supercomputers. Funding may come from a mix of government and private-sector sources.
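Taken at face value, those figures imply a rough current operating budget (an illustrative reading of the article's approximate numbers, not an official figure):

    # Illustrative reading of the approximate figures quoted above.
    construction_cost = 75e6       # expected cost to build, in dollars
    new_annual_operating = 15e6    # expected annual operating cost, in dollars

    # If $15M a year is "about twice" the current operating budget, the current
    # budget works out to roughly $7.5M per year.
    implied_current_budget = new_annual_operating / 2
    print(f"${implied_current_budget / 1e6:.1f}M per year")  # -> $7.5M per year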

Additional issues also need to be addressed, such as who would own the center. One possibility is that NCAR could lease the facility from another organization.

An NCAR project committee, appointed by Tim and co-chaired by CGD's Lawrence Buja and HAO's Peter Fox, is studying the various issues. The committee is consulting with a blue-ribbon panel of NSF-funded scientists from across the geosciences. SCD's Aaron Andersen and F&A's Jeff Reaves are also working on the plan.

"It's been an exhilarating experience working with so many of the nation's top scientists and the talented experts here to develop this important facility," Lawrence says.

Helping to oversee the process is a UCAR executive committee, which consists of Al Kellie, director of the Computational and Information Systems Laboratory; Katy Schmoll, UCAR vice president for finance and administration; and Larry Winter, NCAR deputy director. UCAR will submit a proposal for the center to NSF this winter, and funding may be secured later next year.

• by David Hosansky

