The World’s Most Powerful Climate Change Supercomputer Powers Up

Photo: Carlye Calvin / NCAR. A fish-eye view of some of the Yellowstone supercomputer's 100 racks; an iconic scene from Yellowstone National Park is featured mosaic-style on the ends of each rack.

For all the political discord over climate change, one thing everyone can probably agree on is that when it comes to throwing computational resources at weather modeling, the more the merrier.

Think of the new computer that just came online at the NCAR-Wyoming Supercomputing Center in Cheyenne, Wyoming, as a kind of meteorological dream come true, then: it represents a mammoth increase in raw number-crunching prowess, dedicated to studying everything from hurricanes and tornadoes to geomagnetic storms, tsunamis, wildfires, air pollution and the location of water beneath the earth’s surface.

(MORE: What, Exactly, Is a Supercomputer?)

Call it “Yellowstone,” because that’s what the federally funded National Center for Atmospheric Research calls it. It’s a supercomputer, and not just your average massively parallel monster, but a 1.5 petaflop (that’s 1,500 teraflops) IBM-designed behemoth capable of an astonishing 1.5 quadrillion calculations per second, one that as of June 2012 ranks among the top 20 most powerful computers in the world.

Only “top 20”? 1.5 petaflops is nothing to sneeze at. While the fastest supercomputer in the world today, IBM’s “Sequoia” at Lawrence Livermore National Laboratory in Livermore, California, can handle over 16 petaflops, just four years ago the world’s fastest computer (also an IBM machine, dubbed “Roadrunner”) had only just broken the 1-petaflop barrier for sustained performance.
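If the units are blurring together, here’s a quick back-of-the-envelope sketch using only the figures quoted above (rounded, and comparing peak ratings rather than sustained performance):

```python
# FLOPS unit arithmetic using the figures quoted in the article.
# 1 teraflop = 10^12 floating-point operations per second;
# 1 petaflop = 10^15, i.e. a quadrillion.
TERA = 1e12
PETA = 1e15

yellowstone = 1.5 * PETA  # Yellowstone's quoted peak
sequoia = 16 * PETA       # "over 16 petaflops"
roadrunner = 1.0 * PETA   # Roadrunner's 2008 record

print(f"Yellowstone: {yellowstone / TERA:,.0f} teraflops")              # 1,500
print(f"Sequoia vs. Yellowstone: ~{sequoia / yellowstone:.0f}x")        # ~11x
print(f"Yellowstone vs. Roadrunner: ~{yellowstone / roadrunner:.1f}x")  # ~1.5x
```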

According to NCAR, Yellowstone is divided into three primary sections: a blistering-fast compute cluster powered by a whopping 72,288 Intel Sandy Bridge EP processor cores backed by 144.6 terabytes of memory, a massive central disk storage farm and a system for visualizing all of that data.

All told, Yellowstone is rated at 30 times the power of its predecessor, a system known as “Bluefire” that NCAR took possession of back in April 2008. At the time, Bluefire was state of the art: a supercomputer capable of peaking at 76 teraflops (76 trillion calculations per second). To put that in context, NCAR says that where Bluefire would take three hours to carry out an “experimental short-term weather” forecast, Yellowstone might render it in just nine minutes. And as you’d expect, the increase isn’t just a matter of raw speed: Yellowstone will also be able to model earth processes of much more daunting complexity.
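Those two runtimes imply roughly a 20x speedup on that particular workload; here’s the trivial arithmetic, for the curious (the 30x figure presumably reflects overall throughput across NCAR’s workloads rather than this single benchmark):

```python
# Speedup implied by NCAR's forecast example above.
bluefire_minutes = 3 * 60  # "three hours"
yellowstone_minutes = 9    # "just nine minutes"
print(f"Forecast speedup: ~{bluefire_minutes / yellowstone_minutes:.0f}x")  # ~20x

# Peak-flops ratio, for comparison: 1,500 vs. 76 teraflops.
print(f"Peak-flops ratio: ~{1500 / 76:.0f}x")  # ~20x
```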

“The Yellowstone supercomputer will dramatically advance our understanding of Earth,” says Al Kellie, director of NCAR’s Computational and Information Systems Laboratory (CISL), on NCAR’s website. “Its computing capacity and speed will allow us to investigate a wide range of phenomena that affect our lives, with more detail than ever before.”

Sweeping “It’s like Professor Hulk on steroids!” claims about computing power aside, let’s talk calculation specifics. How exactly will Yellowstone, which ran up a tab of between $25 million and $35 million and runs in a data center that cost around $70 million to build, earn its keep?

(PHOTOS: A Brief History of the Computer)

Imagine zooming in on a map in a computer browser or smartphone app, summoning enhanced geographical detail within incrementally smaller visible areas. That’s what NCAR says Yellowstone will do for climate projections, narrowing the conventional 60-square-mile units used in climate change modeling today to just seven-square-mile tranches. It’s like increasing the magnification of a microscope, then aggregating all the fine detail to weave a more scrupulous data quilt.
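To see why that matters computationally, here’s a minimal sketch of the cell-count arithmetic. It assumes the quoted figures are per-cell areas, as the wording suggests, and uses a rough 3.1 million square miles for the continental U.S. (that area figure is my assumption, not NCAR’s):

```python
# How many grid cells it takes to tile a region at the two quoted resolutions.
# Assumption (not from the article): the continental U.S. covers roughly
# 3.1 million square miles, and the quoted figures are per-cell areas.
AREA_CONUS = 3.1e6        # square miles, rough
coarse = AREA_CONUS / 60  # cells at 60 square miles apiece
fine = AREA_CONUS / 7     # cells at 7 square miles apiece

print(f"Coarse grid: ~{coarse:,.0f} cells")                       # ~52,000
print(f"Fine grid:   ~{fine:,.0f} cells ({fine / coarse:.1f}x)")  # ~443,000, ~8.6x
```

And that multiplier applies to a single two-dimensional layer; stack in vertical levels and finer time steps, and the appetite for petaflops becomes clear.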

Take what Yellowstone aims to do for research on “thunderstorms and tornadoes,” for instance. According to NCAR:

Scientists will be able to simulate these small but dangerous systems in remarkable detail, zooming in on the movement of winds, raindrops, and other features at different points and times within an individual storm. By learning more about the structure and evolution of severe weather, researchers will be able to help forecasters deliver more accurate and specific predictions, such as which locations within a county are most likely to experience a tornado within the next hour.

Or consider how it could impact “long-term forecasting”:

Farmers, shipping companies, utilities, and other planners would benefit enormously from forecasts that accurately predict weather conditions a month in advance. Because large-scale oceanic and atmospheric patterns play such a major role at this time scale, scientists will rely on supercomputers such as Yellowstone to provide needed detail on the effects of these big patterns on future local weather events. Yellowstone’s size also allows for more ensembles—multiple runs of the same simulation, each with a small change in the initial conditions—that can shed important light on the skill of longer-term forecasts.
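To make “ensembles” concrete, here’s a minimal sketch in Python, using nothing NCAR actually runs: the chaotic Lorenz-63 system stands in for the atmosphere, and ten copies of the same simulation start from near-identical initial conditions, then gradually drift apart:

```python
import random

# Toy ensemble forecast: integrate the chaotic Lorenz-63 system from
# slightly perturbed initial conditions and watch the members spread.
# (A stand-in for a real atmospheric model, purely for illustration.)

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 equations."""
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

random.seed(1)
N_MEMBERS, N_STEPS = 10, 1500
# Each ensemble member starts near (1, 1, 1) with a tiny random nudge.
members = [(1.0 + random.uniform(-1e-4, 1e-4), 1.0, 1.0) for _ in range(N_MEMBERS)]

for step in range(1, N_STEPS + 1):
    members = [lorenz_step(*m) for m in members]
    if step % 500 == 0:
        xs = [m[0] for m in members]
        print(f"step {step:4d}: ensemble spread in x = {max(xs) - min(xs):.4f}")
```

The spread among the members is the payoff: while they agree, the forecast can be trusted; once they diverge, it can’t. Bigger machines buy more members, and therefore a better read on that divergence.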

NCAR says Yellowstone will also be able to help “work toward the development of seasonal forecasts of sea ice,” improve fire pattern predictions when wildfires break out, locate gas and oil deposits miles beneath the earth’s surface with more precision (along with subsurface areas that could be used to store carbon) and lay the groundwork for pollutant modeling that could yield more accurate air quality forecasts days in advance.

Up first, Yellowstone will tackle 11 research projects that NWSC technology development director Rich Loft says will “try to do some breakthrough science straight away and try to shake the machine” (via Computerworld).

“We want to see what happens when users beat on it instead of just doing acceptance testing.”

MORE: The Collapse of Moore’s Law: Physicist Says It’s Already Happening