Animating algorithms

Supercomputing drives visualization for science, entertainment alike

This article appears in a recent ORNL Reporter newsletter.

A Dreamworks research group received time on ORNL’s Jaguar supercomputer to work out algorithms used in this summer’s feature, Kung Fu Panda.
What do a potential drug for the treatment of Alzheimer’s disease, supernova explosions, and Kung Fu Panda have in common?

Jaguar, ORNL's Cray XT4 supercomputer, crunched numbers for all of these projects, changing the way animators make movies and the way scientists look at their data.

Jaguar, named the second fastest system in the world in early 2007, recently received an upgrade that doubles its performance. It now uses more than 31,000 processing cores and does up to 263 trillion calculations per second (263 teraflops). This powerful supercomputer performs some of the world's most complex computer simulations, revolutionizing research.

It also played a part in Dreamworks Animation's Kung Fu Panda, a movie about Po, a clumsy panda bear who wants to be a kung-fu warrior. Sean Ahern of the Center for Computational Sciences says a research group from Dreamworks developed image generation algorithms with Jaguar that were used in their latest movie.

Previously, Sean says, the studio rendered (generated the image from the computer model) in batch mode: many animators worked with low-resolution models to try things out, picked their favorites, and sent the higher-resolution images to render overnight. This took a lot of time.

Dreamworks researcher Evan Smyth wanted to come up with a new breed of image-generation algorithms, taking advantage of multiple processing cores to dramatically accelerate the rendering process and reflect changes to the model in real time. So he got in line with other researchers vying for time on the most powerful supercomputer available for open research, Jaguar.
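Smyth's actual algorithms aren't described in the article, but the core idea of spreading rendering work across many cores can be sketched. Here is a minimal, hypothetical illustration: a frame is split into rows, and a pool of worker processes shades them concurrently. The `shade_row` function is a stand-in for a real shader.

```python
from multiprocessing import Pool

WIDTH, HEIGHT = 64, 48

def shade_row(y):
    # Toy per-pixel computation standing in for a real shader;
    # each row is independent, so rows can be shaded in parallel.
    return [(x * y) % 256 for x in range(WIDTH)]

def render_parallel(processes=4):
    # Distribute rows across worker processes and collect the
    # shaded rows back in order.
    with Pool(processes) as pool:
        return pool.map(shade_row, range(HEIGHT))
```

Because each row (or tile) depends only on the scene description, not on other rows, the work divides cleanly across cores; this independence is what makes rendering such a good fit for machines like Jaguar.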

Use of Jaguar's computing time is determined through DOE's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. Universities, private industry and government research laboratories apply for computing time amounting to more than 100 million processor hours.

Scientists use visualization for increased understanding of data, and animators use it for entertainment. "Animators don't do it to help you understand the science; they do it to tell the story," Sean says, "but they drive realistic image generation."

Realism in image generation comes down to one thing—light. Getting the play of light right—its behavior as it reflects, refracts and diffuses—turns out to be the key factor in making a computer-generated image appear real.

The calculations involved in ray tracing, or tracing the theoretical path of light rays as they interact with the surfaces of the model, "make image generation incredibly computing intensive," says Sean. Calculating millions of bounces for each ray of light, with different degrees of penetration, diffusion, reflection or refraction at every surface they encounter, can take days, depending on the complexity of the environment being modeled.
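The recursion Sean describes, where each bounce spawns another ray, can be sketched in a few lines. This is a toy scene of my own invention (one sphere, one directional light, fixed reflectivity), not any renderer's actual code: a ray is tested against the sphere, shaded diffusely at the hit point, and a reflected ray is traced recursively up to a bounce limit.

```python
import math

SPHERE_CENTER = (0.0, 0.0, 3.0)   # toy scene: a single sphere
SPHERE_RADIUS = 1.0
LIGHT_DIR = (0.0, 0.0, -1.0)      # unit vector pointing toward the light
SKY = 0.2                         # background brightness for rays that miss
REFLECTIVITY = 0.3                # how mirror-like the surface is

def dot(a, b):
    return sum(a[i] * b[i] for i in range(3))

def ray_sphere(origin, direction):
    # Nearest positive intersection distance, or None on a miss.
    # direction is assumed unit length, so the quadratic's a == 1.
    oc = tuple(origin[i] - SPHERE_CENTER[i] for i in range(3))
    b = 2.0 * dot(direction, oc)
    c = dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

def trace(origin, direction, depth=3):
    # One recursion level per light bounce, up to a depth limit.
    t = ray_sphere(origin, direction)
    if t is None:
        return SKY
    hit = tuple(origin[i] + t * direction[i] for i in range(3))
    normal = tuple((hit[i] - SPHERE_CENTER[i]) / SPHERE_RADIUS for i in range(3))
    diffuse = max(0.0, dot(normal, LIGHT_DIR))
    if depth == 0:
        return diffuse  # bounce budget exhausted
    d_n = dot(direction, normal)
    reflected = tuple(direction[i] - 2.0 * d_n * normal[i] for i in range(3))
    return (1 - REFLECTIVITY) * diffuse + REFLECTIVITY * trace(hit, reflected, depth - 1)
```

Even this stripped-down version shows where the cost comes from: every hit spawns another `trace` call, and a production scene multiplies that by millions of rays and far more elaborate surface models.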

To illustrate his point, Sean shows two movies he has made of the ITER fusion reactor, the international project to develop new sources of electrical power.

Both movies show radio frequency energy bouncing around inside the reactor.

The first movie uses a simple lighting model that takes only a few minutes to render. The second movie uses a lighting model that adds reflections and shadows, increasing the rendering time to about two days. The added detail increases the reality, putting you inside the reactor while the energy is reflecting on the shiny metal walls.

Says Sean, "You can tell stories that you could not tell 20 years ago by making movies of processes that can't be filmed." This is good for scientists because "it provides more tools for conveying data to our human visual system, presenting it in a form that is more easily understood by the human brain."

Staff at the Center for Computational Sciences have given several researchers their first glimpses of data from projects they have put months or even years of effort into, and the images often flip a switch, helping the scientists draw new information from the data.

"They just didn't get the whole picture until they saw the rendering," says Sean.

Ed Uberbacher of the Biosciences Division benefited from the center's help in visualizing his research project. Looking at his model of a new drug that could help to treat Alzheimer's, he noticed that some small additions and changes in the chemical structure of the drug would increase its ability to interact with the proteins that form the Alzheimer's-inducing plaques in the brain.

Sean also recalls the Physics Division's Tony Mezzacappa having an a-ha moment when he looked at the data visualization of a simulated supernova explosion.

He saw a surprising asymmetry, a lopsidedness in the rotation. The data suggested a possible explanation for instabilities in the shock waves formed during the core-collapse and subsequent explosion of massive stars.

The mechanism behind the explosion of a supernova is one of the most important unsolved problems in astrophysics and has been the subject of numerical simulations for more than three decades. The new information in Tony's simulation was lurking in the numbers all along but was not understood until it was presented in visual format.

The goal is "to deliver images that someone can understand, to do it better, more easily and faster," Sean says. "If I can do that, it means that we are reducing the time to understanding."—Sarah Wright