DOE Pulse
Number 386  |  April 15, 2013

Berkeley Lab Tackles Next-Generation Climate Models

A Community Atmosphere Model version 5 (CAM5) visualization of water vapor.

Tornadoes, twisting winds that descend from thunderheads, and derechos, winds that race ahead of a straight line of storms, are just two varieties of extreme weather events whose frequency and violence are on the increase. To keep up with nature, climate models are running at ever-higher resolutions, requiring ever-greater processing speeds and new computer architectures.

Michael Wehner is a climate scientist in the Computational Research Division (CRD) of Lawrence Berkeley National Laboratory who focuses on just such extreme weather events. He notes that simulations run with a low-resolution climate model can give results completely contrary to those from a high-resolution version of the same model.

“My conclusion from a 100-kilometer model is that in the future we will see an increased number of hurricanes,” Wehner says, but a more realistic simulation from a 25-kilometer model yields a highly significant difference: “The total number of hurricanes will decrease but the number of very intense storms will increase.”

Finer resolution in time as well as space is what pumps up the yield. “To look at extreme weather you need high-frequency data,” says Wehner. “A model run can produce 100 terabytes of output.” That’s 100 trillion bytes.

It’s the reason why a dataset that would take Wehner 411 days to crunch on a single-processor computer takes 12 days on Hopper, a massively parallel supercomputer at the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab. That’s still not good enough. Wehner thinks it should take an hour.
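The speedup Hopper delivers, and the further factor needed to reach Wehner’s one-hour target, follow directly from the figures quoted above. A minimal back-of-the-envelope sketch in Python (the script and its variable names are illustrative, not from the article):

    # Back-of-the-envelope arithmetic using the figures quoted above.
    single_processor_days = 411   # time to analyze the dataset on one processor
    hopper_days = 12              # time on NERSC's Hopper supercomputer
    target_hours = 1              # Wehner's stated goal

    hopper_speedup = single_processor_days / hopper_days
    target_speedup = (single_processor_days * 24) / target_hours
    remaining_factor = target_speedup / hopper_speedup

    print(f"Hopper speedup over a single processor: ~{hopper_speedup:.0f}x")
    print(f"Speedup needed to finish in one hour:   ~{target_speedup:.0f}x")
    print(f"Further improvement still required:     ~{remaining_factor:.0f}x")

Run as written, this works out to roughly a 34-fold speedup from Hopper, with nearly another 300-fold improvement needed to bring the job down to an hour.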

Models that can now process petabytes (a quadrillion bytes) per second will soon need to accommodate exabytes (a quintillion bytes) per second. John Shalf, Berkeley Lab researcher and NERSC’s Chief Technology Officer, is leading an effort to achieve exascale performance.
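For scale, the jump from petabytes to exabytes is a factor of one thousand; a quick sketch, assuming the standard decimal definitions of the prefixes:

    # Decimal (SI) definitions of the byte prefixes mentioned above.
    PETABYTE = 10**15   # a quadrillion bytes
    EXABYTE = 10**18    # a quintillion bytes
    print(f"Exascale is {EXABYTE // PETABYTE}x petascale")   # prints 1000x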

“Data explosion is occurring everywhere in the Department of Energy ... genomics, experimental high energy physics, light sources, climate,” Shalf says. “We need to rethink our computer design.”

What was once the most important factor in computer performance will soon become the least important. “In 2018 the cost of FLOPS” – floating point operations per second – “will be among the least expensive aspects of a machine, and the cost of moving data across the chip will be the most expensive. That’s a perfect technology storm.” To rise above it, Shalf says, “We’re simulating hardware before it is built,” modeling hardware for exascale systems to predict their performance.

“Tools of yesteryear are incapable of answering these questions on climate,” says Shalf’s colleague Wes Bethel, who heads CRD’s Visualization Group. But he stresses the good news. “Datasets are getting larger, but there’s more interesting science and physics hidden in the data, which creates opportunities for asking more questions.”

Berkeley Lab recently hosted the fifteenth in a series of annual workshops bringing together top climatologists and computer scientists from Japan and the United States to exchange ideas for the next generation of climate models, as well as the high-performance computing environments that will be needed to process their data.

[Julie Chao, 510.486.6491, jhchao@lbl.gov]