You could think of Larry Curtiss, Jeff Greeley and their Argonne National Laboratory colleagues as kind of a catalyst screening committee.
Their computer models can quickly sift candidates for these important industrial and environmental materials, identifying the best ones for chemists to try out in time-consuming laboratory tests. They also can help scientists better understand results from such tests. The combination of computation and experiments is helping accelerate the development and improvement of catalysts.
Computational modeling of chemical processes like catalysis has grown in importance in the 33 years since Curtiss joined Argonne.
“The capability has really tremendously increased” as the Department of Energy boosted high-performance computing at Argonne and other labs, says Curtiss, leader of the Theory and Modeling Group and an Argonne Distinguished Fellow. When that power and improved algorithms are combined, “It’s unbelievable compared to 30 years ago how much faster it can get done and the larger systems with more atoms (researchers can model), but also the accuracy with which we can calculate the properties of molecules and materials.
“We can (now) actually make predictions that can help to guide the experimentalists in making catalysts important to reducing the nation’s energy dependency.”
Catalysts increase the rate of a chemical reaction but are not consumed in the process. They’re everywhere in industry, helping to efficiently produce a multitude of materials and chemicals. They’re in your car’s catalytic converter, helping cut pollutants. They enhance reactions that generate power in a fuel cell, help convert grains, cellulose and animal and vegetable oils to biofuels, and help clean polluted soil and water. Good catalysts cut the heat and energy needed to drive a reaction. That, plus the fact that catalysis can reduce the creation of wasteful byproducts, makes catalytic processes easier on the environment than the alternatives.
Scientists seek several properties in an improved catalyst, Curtiss says. It has to be selective, breaking certain chemical bonds but not others. If the catalyst isn’t selective, it could generate unwanted byproducts that might have to be removed. A good catalyst also has to break those bonds in a way that uses less energy than the process scientists want to replace. The new catalyst also must be stable, so it doesn’t degrade, and be as inexpensive as possible.
The research falls under Argonne’s Center for Nanoscale Materials almost by default, says Greeley, a materials scientist at the center. “In any industrial process, the catalysts are essentially on the order of nanometers. The urgent question is how the size and shape of these particles affect their catalytic properties. In many cases you can obtain significantly improved properties by finding the right size or shape.”
Chemists typically have used a time-consuming trial-and-error approach to finding good catalysts, Curtiss says. “Oftentimes, people don’t really understand how they work.” With the computational models he, Greeley and Argonne materials scientist Peter Zapol have devised, they attempt “to really understand at the molecular and atomic level what is happening for these catalysts. If you can understand what’s happening, you can design new catalysts that are more efficient and more selective.”
The codes Greeley, Curtiss and Zapol develop and use are based on first principles: They calculate what’s happening in a catalytic reaction based on fundamental physical properties, so they don’t have to fit the model to experimental data and can make fewer assumptions or estimates about what’s happening.
“That is an extremely powerful technique, in my opinion, but I admit I’m a little bit biased,” Greeley jokes. With a first-principles approach, the computer model can calculate the energy needed for a catalytic reaction, known as the activation energy, and predict which catalysts lower that energy barrier the most and with the best selectivity. The first-principles methods can study all the elemental reactions between atoms “and figure out which steps are more energetically favorable,” Greeley says. That’s difficult to understand using purely experimental approaches.
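The payoff from lowering an activation barrier can be sketched with the Arrhenius equation, a standard chemistry relation (the numbers below are illustrative assumptions, not figures from the Argonne work):

```python
import math

# Arrhenius equation: k = A * exp(-Ea / (R * T))
# All values here are assumed for illustration only.
R = 8.314    # gas constant, J/(mol*K)
T = 500.0    # reaction temperature, K (assumed)
A = 1.0e13   # pre-exponential factor, 1/s (assumed)

def rate_constant(ea_j_per_mol):
    """Rate constant for a reaction with the given activation energy."""
    return A * math.exp(-ea_j_per_mol / (R * T))

k_uncatalyzed = rate_constant(150e3)  # hypothetical 150 kJ/mol barrier
k_catalyzed = rate_constant(100e3)    # catalyst lowers it to 100 kJ/mol

# Even a modest drop in the barrier multiplies the rate enormously,
# because the barrier sits inside an exponential.
speedup = k_catalyzed / k_uncatalyzed
print(f"speedup ~ {speedup:.1e}")
```

Because the activation energy appears in an exponent, the 50 kJ/mol reduction in this toy example speeds the reaction by roughly five orders of magnitude, which is why predicting barrier heights accurately matters so much.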
“The first-principles strategy by its very nature is molecular-scale modeling,” he adds. “It gives you lots of insight into what the molecules and atoms are doing on the surface of these catalysts.”
First-principles modeling also helps researchers understand the effect of reaction conditions, like temperature and pressure — another aspect of catalysis that is gaining more attention in research circles, Greeley says. When those environmental properties change, “that can sometimes cause a real change in the structure of the catalyst,” he adds. First principles can help decipher how those changes affect the catalysis mechanism and the activation energy.
The level of detail first principles provides comes at a high cost in computational power, however, and the work would be impossible without major computing access. Compute time “increases quite rapidly with the number of electrons that you’re considering and the number of atoms you’re considering in the calculations,” Curtiss says. “As far as doing first principles, it limits the number of atoms that can be included” in the simulation.
For instance, the researchers may easily model a platinum catalyst composed of clusters of four or eight atoms. But if the simulation includes the support material, like aluminum oxide, on which the platinum is deposited, “the computational time really starts increasing,” Curtiss says.
Increasing the cluster to nanometer size or working with a two-element catalyst like vanadium oxide slows calculation even further.
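The scaling Curtiss describes can be sketched numerically. Conventional first-principles methods such as density functional theory typically cost on the order of the cube of the number of electrons; that exponent is a common rule of thumb assumed here, not a figure from the article:

```python
# Rough illustration of why first-principles cost limits system size.
# Cubic scaling in electron count is an assumption (typical of
# conventional DFT), not a measurement from the Argonne codes.

def relative_cost(n_atoms, electrons_per_atom=78, exponent=3):
    """Relative compute cost for a cluster of n_atoms atoms.

    Platinum has 78 electrons per atom; in practice codes treat only
    valence electrons, so this deliberately overstates the count.
    """
    n_electrons = n_atoms * electrons_per_atom
    return n_electrons ** exponent

base = relative_cost(4)  # the small 4-atom platinum cluster from the article
for n in (4, 8, 100):
    print(f"{n:>4} atoms: {relative_cost(n) / base:.0f}x the cost of 4 atoms")
```

Under this assumption, doubling a cluster from four to eight atoms multiplies the cost eightfold, and adding the hundreds of atoms in a support material like aluminum oxide quickly pushes a calculation out of reach on modest hardware.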
The researchers tap several computers to run their catalyst codes, including Carbon, a 1,000-plus-core cluster at the Center for Nanoscale Materials capable of up to 10 teraflops (trillion floating point operations per second); Jazz, a 350-node cluster at Argonne; and computers at the National Energy Research Scientific Computing Center (NERSC) near San Francisco and the Pacific Northwest National Laboratory in Washington state.
First-principles codes are difficult to run efficiently on more than a few dozen processors, so the researchers have yet to make much use of the Argonne Leadership Computing Facility’s Blue Gene/P, one of the world’s fastest machines for open computing. However, Greeley and colleagues at the Technical University of Denmark are working on a first-principles code that scales to thousands of processors.
“It looks promising,” Curtiss says. “We hope in the near future to start using Blue Gene.”
A longer version of this article appears in the 2008-09 print issue of DEIXIS.
