Synergistic R&D

Shrinking the product life cycle

Got the latest smartphone? The lightest mountain bike? The longest-range electric car? Too bad. They’re soooo 1994.

The ugly truth for technophiles and neo-Luddites alike is that it takes about 18 years for the average product to move from conception to production. That’s a big problem for scientists who are trying to address the pressing problems facing the world in the areas of energy and the environment.

Synthetic polymer chemist Deanna Pickel investigates materials designed to increase the efficiency of solar cells. Photo: Jason Richards

“Eighteen years is too long to wait,” says Bobby Sumpter, a chemical physicist who works at ORNL’s Center for Nanophase Materials Sciences. “That’s a fact Congress recognized when it established the Materials Genome Initiative. This program is aimed at integrating experimentation, computing, simulation and fabrication in a synergistic fashion to shorten the product life cycle to perhaps five years.”

More computing, less serendipity

Sumpter explains that one of the most powerful tools researchers have for accelerating the R&D process is high-performance computing and its ability to simulate and screen millions of possible materials to find those that are best for a particular application. “For example,” Sumpter says, “if I wanted to make a greatly improved electronic device with particular characteristics, it may not be practical to simply keep trying different materials because testing each device can take a lot of time. I might accidentally stumble on the right material, but that’s sort of like winning the lottery. Serendipity happens, but it’s not a good strategy for materials development and design or for solving many of the current materials problems.”

Fortunately, high-performance computing can allow this selection process to be guided by a set of calculations and computer simulations that analyzes the physical characteristics of each material and enables researchers to narrow the myriad possibilities to a handful that meet their specifications. Sumpter notes wryly that “it’s a lot easier to make three and test three than it is to make a million and test a million.”

Currently, computers can sort through some material properties better than others, based on the available data and computational approaches. For example, one problem that lends itself to computational analysis is the selection of materials that might be good candidates for use in battery cathodes. Scientists can consult a large database of material structures, compute the electronic properties of each, and select the most promising candidates for a particular battery configuration. Ensuring that battery components, such as cathodes, have the potential for improved capacity is one important step along the product development line for manufacturers of products ranging from tablet computers to electric vehicles.
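The screening funnel Sumpter describes, computing properties for many candidate structures and keeping only a handful worth making, can be sketched in a few lines. The property names, threshold values, and toy "database" below are purely illustrative assumptions, not an actual ORNL workflow; in practice the property prediction step would be an expensive electronic-structure calculation rather than a dictionary lookup.

```python
# Illustrative sketch of computational materials screening: predict a
# property for every candidate, filter by a design specification, and
# rank the survivors so only a few go to the laboratory.
# All names and numbers here are hypothetical.

def predicted_voltage(structure):
    # Stand-in for a real electronic-structure calculation (e.g. DFT);
    # here we simply read a precomputed value from the record.
    return structure["voltage_V"]

def screen_cathodes(database, min_voltage=3.5, max_candidates=3):
    """Narrow a large set of candidate cathode structures to a short
    list that meets the voltage spec, ranked by predicted capacity."""
    viable = [s for s in database if predicted_voltage(s) >= min_voltage]
    viable.sort(key=lambda s: s["capacity_mAh_g"], reverse=True)
    return viable[:max_candidates]

# A toy "database" of four candidate cathode materials.
database = [
    {"name": "A", "voltage_V": 3.9, "capacity_mAh_g": 160},
    {"name": "B", "voltage_V": 3.2, "capacity_mAh_g": 200},
    {"name": "C", "voltage_V": 4.1, "capacity_mAh_g": 140},
    {"name": "D", "voltage_V": 3.7, "capacity_mAh_g": 180},
]

shortlist = screen_cathodes(database)
print([s["name"] for s in shortlist])
```

The payoff is exactly Sumpter's point: candidate B is discarded up front for failing the voltage spec, so only three materials need to be synthesized and tested rather than all of them.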

Applied simulation

Simulations also allow researchers to investigate alternatives to increasingly scarce elements that are commonly used in industrial applications, such as electronics, energy conversion (solar panels, catalytic materials, magnets for wind farms), and automotive and aircraft production. In order to build useful simulations of materials, researchers need to have a thorough understanding of their structures. For example, many of the materials that Sumpter and his colleagues study are crystalline—meaning that their structures have well-defined characteristics that repeat throughout the material. This regularity, generally speaking, makes them easier to simulate. The irregular structure of noncrystalline materials or materials with structural defects, on the other hand, requires more complicated models.

As it turns out, these irregularly structured materials have shown considerable promise as substitutes for hard-to-find elements and in other novel applications.

Sumpter notes that carbon, one of the most abundant elements on the planet, can be modified to perform in a number of unexpected ways. Applications of this versatile material range from lighter, stronger car parts to thermal insulation. Sumpter and his colleagues have made progress in adapting carbon to new purposes by developing simulation tools that allow them to better understand the properties of carbon-based materials with structural defects, as well as those of amorphous and composite materials.

These tools enable materials researchers to use computer simulations to “tweak” the properties of carbon to change its characteristics—a process that would be more difficult and time-consuming in a laboratory. “Imagine we have a sheet of graphene, a type of carbon,” Sumpter says. “A single sheet has very interesting electronic properties. If we cut the sheet into smaller pieces, such as nanoscale ribbons, they have interesting magnetic properties. However, if one graphene ribbon comes into contact with another, the desirable properties can disappear—unless the sheets are aligned in a particular way. You might not know about any of these characteristics without first-principles simulations. This is a situation where using a computer model helps us take a nanoscale property and apply it at the mesoscale and the macroscale. Then, guided by the simulation, we can better determine how to achieve the same results in the laboratory.”

A good example of what can be accomplished through the use of simulation and engineered carbon structures is a 3-D carbon nanotube sponge recently devised by a multi-institutional research team that included Sumpter. “It absorbs oil exceptionally well,” Sumpter says. “It’s also cheap, renewable and can be grown in large quantities.” The nanotube sponge illustrates how nanoscale properties can translate into a useful macroscopic structure—in this case a material that can absorb 10 to 100 times its weight in oil.

The discovery of the nanosponge was a result of an effort to grow clumps of carbon nanotubes by introducing boron atoms into the network of carbon atoms that make up nanotubes. Simulations of various arrangements of the new atoms indicated that they would result in “elbow” junctions in the nanotubes that would cause them to grow into a sponge-like 3-D network—which turned out to be correct.

In addition to being super absorbent and far more efficient than other materials commonly used for oil remediation, the nanosponges are tough. The absorbed oil can be wrung out of them and recovered, or burned out—either way the sponges can be recycled and used over and over again.

Simulations have also shown how carbon-based systems can be useful for the catalytic production of synthetic fuels. Such a process typically depends on transition metal catalysts to convert a gaseous mixture of carbon monoxide and hydrogen into liquid hydrocarbons. However, recent simulations have found that carbon-based materials with specific defects can promote the catalytic process more effectively.

Collaboration and codesign

Working directly with industry is another way to compress the product development life cycle. These research relationships often involve not only laboratory work but also theory, modeling and simulation aimed at enhancing the properties of materials. Collaboration among researchers from all stages of the product life cycle can speed a product toward production just as surely as employing computer simulations.

Sumpter notes that a good example of this kind of collaborative design can be seen in the preparations researchers are making to move from the current generation of supercomputers to the next one, which will be a thousand times more powerful. “We don’t just want a faster computer,” he says, “we want it to be usable on critical problems as soon as the hardware is available. We can’t develop a computer and spend 10 years writing software to take advantage of it. If we do that, the product life cycle hasn’t been changed.”

Sumpter says that to be sure they’re ready on day one, the strategy is to employ a process of co-design. “That means experimental scientists; computational scientists; the computer manufacturer’s hardware designers; and the mathematicians, computer scientists, and engineers who apply the results of research are all working directly with one another to design viable next-generation computers.” The hope is that this process will result in a system that, from day one, can be used by scientists and engineers to solve critical problems.

Sumpter suggests that a similar model could be used to improve efficiency in any area of science. It’s just a matter of clearly defining what needs to be accomplished at the end of the process. “Sometimes the key to success is finding a way of expressing ideas that can be understood among all the different disciplines,” he says. “It’s not trivial; it’s a cultural thing.”

A new approach

Sumpter foresees that efforts in the areas of computation and collaboration will radically change the traditional approach to the design and deployment of materials and will shorten the time required to bring a concept to market.

“These changes will result from the integration of theory, computing, characterization and synthesis, from multiple disciplines,” he says. “The way a physicist looks at a material is very different from the way a chemist or biologist looks at it, and this expanded perspective has already enabled us to overcome some tough challenges.”

“We have the facilities; we have people with really good ideas and lots of energy, and we have problems that need to be solved. There’s no reason we can’t go forth and solve them.” —Jim Pearce