Extending the stockpile’s lifespan

When people think about nuclear weapons, most probably picture warheads sitting silently on missiles or within secure hangars near airbases. The bombs themselves appear cold and unchanging.

Nothing could be further from the truth. Nuclear weapons are changing all the time, says Scott Doebling of Los Alamos National Laboratory, who recently took leave from managing the lab’s Verification and Validation (V&V) program to accept an assignment with the National Nuclear Security Administration (NNSA).

Simulating nuclear weapons requires a detailed understanding of material behavior. Here, a large shock will cause solid copper to eject material. The simulation, which ran for 88 hours on the world’s second-fastest supercomputer, Blue Gene/L, described the behavior of 800 million atoms over 1 nanosecond.

“You’re dealing with systems that have radioactive isotopes in them. As the isotopes decay over time, they emit radiation that changes the properties of the materials surrounding them in the warhead.”

Just how does prolonged exposure to nuclear radiation change a material’s properties? How do those changes alter the way a weapon performs? Will an older bomb still perform as intended? Will it remain safe and intact in an aircraft crash or a fire?

At the height of the Cold War, the United States had easy answers to those questions. First, it replaced weapons every 10 to 15 years. Second, it tested weapons to ensure they performed as intended. Finally, America kept adding to its stockpile.

But in 1992, the United States halted underground nuclear testing, reduced the size of the existing nuclear stockpile and stopped making nuclear weapons.

Since the nation no longer built new warheads, it needed to extend the lifespan of the ones it had. It also needed to assess the performance of those aging and reconditioned weapons — and it had to do it without testing.

Today, the Department of Energy relies on computational simulations to predict weapons performance, employing several of the world’s largest supercomputers and some of the most sophisticated computer models ever created. Yet the models raise questions of their own. They are large, complex combinations of smaller models, often based on different understandings of how physics works. They sometimes deal with temperatures and pressures where the physics is not well understood. Running such complex software also requires compromises that could change a model’s ability to predict future warhead behavior.

The task of the Los Alamos V&V team is to quantify those uncertainties and improve our confidence in the simulations.
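
To make that idea concrete, here is a minimal, hypothetical sketch of uncertainty propagation: a toy stand-in for a physics sub-model is run many times with its uncertain inputs sampled from assumed ranges, and the spread of the outputs becomes the quantified uncertainty. The model form, the parameter names (yield_strength, decay_fraction) and the distributions are invented for illustration only and bear no relation to the actual codes the V&V team assesses.

```python
import random
import statistics

# Toy surrogate for a physics sub-model: the output depends on two
# material parameters whose values are only known within a range.
# Both the model form and the parameter ranges are illustrative
# assumptions standing in for far larger coupled simulations.
def toy_model(yield_strength, decay_fraction):
    # A simple nonlinear response: degraded material lowers performance.
    return yield_strength * (1.0 - decay_fraction) ** 2

def propagate_uncertainty(n_samples=50_000, seed=42):
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        # Sample each uncertain input from an assumed distribution.
        strength = rng.gauss(mu=1.0, sigma=0.05)   # nominal value +/- 5%
        decay = rng.uniform(0.00, 0.10)            # 0-10% radiation damage
        outputs.append(toy_model(strength, decay))
    return statistics.fmean(outputs), statistics.stdev(outputs)

if __name__ == "__main__":
    mean, stdev = propagate_uncertainty()
    print(f"predicted response: {mean:.3f} +/- {stdev:.3f} (1 sigma)")
```

Running the sketch prints a mean response with a one-sigma spread. In real V&V work the “model” is a full weapons simulation and the sampling strategy is far more sophisticated, but the logic of propagating input uncertainty through to output uncertainty is the same.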

Inherent Problems

Today’s computer models produce incredibly detailed results that appear to precisely simulate the behavior of weapons and materials as they age.

Yet do they? The uncertainties start with construction of the models themselves.

“Many of the mathematical models we use to understand physics are limited in what they explain,” Doebling says. One theory, such as electrodynamics, may elucidate the behavior of a few atoms individually. Yet it cannot portray the transfer of heat or force through a block of material made of those atoms. That would take a second theory, such as thermodynamics or mechanics.
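
As a loose illustration of that handoff between theories, the sketch below chains two toy models: a stand-in “atomistic” calculation that averages per-atom estimates into a bulk thermal diffusivity, and a continuum finite-difference solver for the 1-D heat equation that consumes it. The function names, numbers and averaging rule are simplified assumptions meant only to show how separate models at different scales pass information, not how production codes work.

```python
# A deliberately simplified sketch of the two-theory handoff Doebling
# describes: a small "atomistic" calculation supplies an effective bulk
# property, which a separate continuum model then uses.

def effective_diffusivity(per_atom_values):
    """Stand-in for an atomistic sub-model: average per-atom estimates
    into a single bulk thermal diffusivity (arbitrary units)."""
    return sum(per_atom_values) / len(per_atom_values)

def continuum_heat_step(temps, alpha, dx, dt):
    """One explicit finite-difference step of the 1-D heat equation,
    dT/dt = alpha * d2T/dx2, with fixed-temperature boundaries."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for this dt/dx"
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + r * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
    return new

if __name__ == "__main__":
    # Scale 1: the "atomistic" model yields an effective material property.
    alpha = effective_diffusivity([0.9, 1.1, 1.0, 0.95, 1.05])

    # Scale 2: the continuum model consumes that property.
    temps = [0.0] * 21
    temps[0] = 100.0                      # hot boundary
    for _ in range(200):
        temps = continuum_heat_step(temps, alpha, dx=1.0, dt=0.4)
    print(f"midpoint temperature after 200 steps: {temps[10]:.2f}")
```

In the production codes the article describes, each of those stages is itself a large model, and quantifying what every such handoff contributes to the overall uncertainty is precisely the V&V team’s concern.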
