
Nuclear predictive

Imagine a new-generation nuclear plant – inherently safer, more efficient and able to refine petroleum or manufacture plastics while producing energy.

This animation shows the early-time pressure distribution in a simulation of coolant flow through a 217-pin wire-wrapped nuclear reactor fuel subassembly, computed on 32,768 processors of the Argonne Leadership Computing Facility’s Blue Gene/P. The program used nearly 1 billion data points distributed through the simulated subassembly to calculate properties such as pressure and temperature over time.

Such power plants would reduce America’s reliance not just on foreign oil but on fossil fuels in general; while heating and lighting millions of homes, they would also power some of the most energy-intensive processes in American manufacturing.

That’s the potential of technology that would improve the United States’ water-cooled reactors, and it holds promise as America moves toward more advanced gas-cooled reactors.

One hurdle is to win the hearts of Americans who remember the Three Mile Island accident and the Chernobyl nightmare.

But another huge hurdle is cost – how to justify investing billions of dollars in constructing a new-generation plant with no guarantee that the BTUs produced would make that investment a sound one in the long run.

The construction costs of a nuclear power plant are enormous, but so are the costs of research – the painstaking hours, months and years invested in analyzing the interactions of neutronics, fluid mechanics, and structural mechanics in order to predict the behavior of the reactor throughout its lifetime.

Potential investors in next-generation reactors and the U.S. Department of Energy are counting on the synergistic efforts of reactor designers, computational scientists and applied mathematicians to find ways to analyze reactor flow through simulation – capitalizing on the power of high-performance computers.

One of the people leading the way is Paul Fischer, an applied mathematician and mechanical engineer who works in the Mathematics and Computer Science Division at DOE’s Argonne National Laboratory near Chicago.

Fischer uses millions of hours of computer processing time on the IBM Blue Gene/P and a unique code to make detailed simulations of coolant flow in a reactor. His modeled device is about the size of a long, narrow mailbox, packed with 217 fuel pins and about 1 billion data points.

Any one of a number of properties can be calculated at each data point – temperature, pressure, turbulence and velocity.

It takes 65,000 processors, crunching numbers eight hours a day for 16 days, to understand what the entire mailbox-sized device experiences at those pressures and temperatures.
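Taken at face value, those figures imply a compute budget in the millions of processor-hours per run, consistent with the “millions of hours” of processing time cited above. This is a back-of-the-envelope estimate, not a tally from Argonne:

\[
65{,}000 \ \text{processors} \times 8 \ \text{hours/day} \times 16 \ \text{days} \approx 8.3 \ \text{million processor-hours.}
\]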

And when Argonne gets its next-generation IBM Blue Gene computer, expected to be among the world’s fastest, Fischer’s nuclear reactor flow simulation will be one of the first applications to run on it.

Fischer’s large-eddy simulations of flow in sodium-cooled reactors are 100 times more ambitious than any done before. The work is designed to run at petascale speeds – more than 1 quadrillion floating-point operations a second – and to provide detailed information about heat transfer within the core.

The aim is to demonstrate that the temperature inside a helium-cooled or sodium-cooled reactor can be reliably predicted. That’s crucial, because if plant operators have confidence in the precise temperature, they can run nuclear reactors at higher power levels without compromising safety – resulting in reduced energy costs.

In Fischer’s simulation, coolant passes through interior flow channels between the 217 pins – each of which has a single wire spiraling around it – and through corner channels between the pins and the walls.
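To give a feel for the bookkeeping behind that geometry, here is a minimal sketch that assumes the standard hexagonal-lattice subchannel formulas; the function name, the ring-based breakdown into interior, edge and corner channels, and the use of Python are illustrative assumptions, not details of Fischer’s code.

```python
def hex_bundle_channels(rings):
    """Count pins and coolant subchannels in a hexagonal wire-wrapped bundle.

    Assumes the standard hexagonal-lattice formulas (not taken from
    Fischer's code): a bundle with `rings` rings of pins around a central
    pin has 3*rings*(rings - 1) + 1 pins, 6*(rings - 1)**2 interior
    subchannels between pins, 6*(rings - 1) edge subchannels along the
    duct walls and 6 corner subchannels.
    """
    pins = 3 * rings * (rings - 1) + 1
    interior = 6 * (rings - 1) ** 2
    edge = 6 * (rings - 1)
    corner = 6
    return pins, interior, edge, corner

# A 217-pin subassembly corresponds to 9 rings: 3*9*8 + 1 = 217.
pins, interior, edge, corner = hex_bundle_channels(9)
print(pins, interior, edge, corner)  # 217 384 48 6
```

Under those assumptions, the 217-pin bundle contains several hundred coolant subchannels, each threaded by the spiraling wire wrap the simulation must resolve.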

By exploiting symmetries, and by virtue of the relatively short entrance length for the flows, Fischer can simplify the calculations so that only a single wire pitch must be simulated.

An aging fleet

America’s operating nuclear power plants were built in the 1960s and 1970s, but they use technology that is even older, dating to the 1940s and 1950s.

“These plants were constructed when we didn’t even have desktop computers,” says Tim Tautges, a computer scientist and head of Argonne’s SHARP project, which is developing high-accuracy simulation tools for reactors. “The basis for their designs was largely experimental – they performed a great deal of experiments to characterize the behavior of nuclear reactors.”


Bill Scanlon

The author is a former Krell Institute science writer.

