Pounding out atomic nuclei

Nuclear reactions, from fission in reactors to fusion in stars, depend on interactions between protons and neutrons, the building blocks of atomic nuclei.

Describing all of the nuclei and the reactions between them, however, demands powerful algorithms running on high-performance computers.

The Universal Nuclear Energy Density Functional (UNEDF) collaboration, which was created by the Department of Energy’s Scientific Discovery through Advanced Computing (SciDAC) program, focuses on developing such descriptions.

An optimized sequence of parameter values in nuclear simulations. (Image courtesy of Stefan Wild.)

The UNEDF collaboration includes researchers from seven national laboratories – Ames, Argonne, Lawrence Berkeley, Lawrence Livermore, Los Alamos, Oak Ridge, and Pacific Northwest – and nine universities: Central Michigan, Iowa State, Michigan State, Ohio State, San Diego State, North Carolina at Chapel Hill, Tennessee-Knoxville, Texas A&M-Commerce and the University of Washington. Recently, researchers in this collaboration made a significant advance through the use of density functional theory (DFT).

On Earth, only about 300 kinds of nuclei – specific combinations of protons and neutrons – exist. In accelerators and stars, the number of known nuclei grows to about 3,000, and it could eventually expand to around 6,000. Many of these tiny systems prove extremely difficult to study, largely because they live such short lives before decaying.

Consequently, researchers need ways to accurately simulate these elusive species. Other applications also require extremely precise simulations of interacting nuclei. For example, the National Nuclear Security Administration (NNSA) Stockpile Stewardship Program requires such simulations to assess the safety and functionality of the weapons in the U.S. nuclear stockpile.

Witold Nazarewicz, professor of physics at the University of Tennessee and co-director of UNEDF, describes the basic structure: “A nucleus resembles a droplet of liquid, where there’s a high density inside and a surface area where it drops, and there’s little outside.” Moreover, the quantum behavior of the protons and neutrons at that surface determines the energy of the nucleus and how it interacts with other nuclei. “We need to know how the nuclear energy is generated in a nucleus to use it.”

Talented teamwork

DFT provides an extremely useful, but not necessarily easy, approach to modeling nuclei. For one thing, DFT includes many parameters that must be determined. As Stefan Wild, assistant computational mathematician in the Laboratory for Advanced Numerical Simulations at Argonne and a fellow in the Computation Institute at the University of Chicago, asks, “What are the best parameters to calibrate these new models to experimental data?”

In the past, scientists searched for the best parameters with what Wild, an alumnus of DOE’s Computational Science Graduate Fellowship, calls “a lot of hand-tuning. They used intuition to pick the values of parameters, ran a simulation, saw how close the answer came to observed data, made small adjustments and ran the simulation again.”

Given the increasing complexity of nuclear simulations, however, “hand-tuning was like looking for a needle in a haystack and far too time consuming to do anything rigorous or thorough.”

As high-performance computing grew more powerful, though, Wild says that “people started thinking about doing something more mathematical” with DFT. For example, Wild and Jorge Moré, an Argonne Distinguished Fellow and director of Argonne’s Laboratory for Advanced Numerical Simulations, developed an algorithm and computer code called POUNDERS (for “practical optimization using no derivatives for sums of squares”).
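The idea behind this kind of calibration can be sketched in a few lines: adjust model parameters so that simulated values match experimental data, minimizing a sum of squared misfits without computing derivatives of the (often expensive) simulation. The toy model, data and use of SciPy's Nelder-Mead method below are illustrative stand-ins, not POUNDERS itself or nuclear DFT:

```python
# Sketch of calibrating model parameters to data with a derivative-free
# least-squares fit, in the spirit of (but much simpler than) POUNDERS.
import numpy as np
from scipy.optimize import minimize

# Hypothetical "experimental" observations we want the model to reproduce.
x_data = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y_data = np.array([1.0, 0.61, 0.37, 0.22, 0.14])

def simulate(params, x):
    """Stand-in for an expensive simulation: a two-parameter decay model."""
    amplitude, rate = params
    return amplitude * np.exp(-rate * x)

def sum_of_squares(params):
    """Misfit between simulation and data; POUNDERS exploits this
    sum-of-squares structure explicitly."""
    residuals = simulate(params, x_data) - y_data
    return np.sum(residuals ** 2)

# Nelder-Mead needs only objective evaluations, no derivatives -- a crude
# stand-in for the model-based trust-region approach POUNDERS uses.
result = minimize(sum_of_squares, x0=[0.5, 0.5], method="Nelder-Mead")
print("best-fit parameters:", result.x)
print("misfit:", result.fun)
```

Replacing hand-tuning with such an automated search is what lets the calibration scale to simulations with many parameters, where each objective evaluation may cost hours of supercomputer time.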


Mike May
