Extending the stockpile’s lifespan

When people think about nuclear weapons, most probably picture warheads sitting silently on missiles or in secure hangars near airbases. The bombs themselves appear cold and unchanging.

Nothing could be further from the truth. Nuclear weapons are changing all the time, says Scott Doebling of Los Alamos National Laboratory, who recently took leave from managing the lab’s Verification and Validation (V&V) program for an assignment with the National Nuclear Security Administration (NNSA).

Simulating nuclear weapons requires a detailed understanding of material behavior. Here, a large shock causes solid copper to eject material. The simulation, which ran for 88 hours on the world’s second-fastest supercomputer, Blue Gene/L, described the behavior of 800 million atoms over 1 nanosecond.

“You’re dealing with systems that have radioactive isotopes in them. As the isotopes decay over time, they emit radiation that changes the properties of the materials surrounding them in the warhead.”

Just how does prolonged exposure to nuclear radiation change a material’s properties? How do those changes alter the way a weapon performs? Will an older bomb still perform as intended? Will it remain safe and intact if the aircraft carrying it crashes or if it burns in a fire?

At the height of the Cold War, the United States had easy answers to those questions. First, it replaced weapons every 10 to 15 years. Second, it tested weapons to ensure they performed
as intended. Finally, America kept adding to its stockpile.

But in 1992, the United States halted underground nuclear testing, reduced the size of the existing nuclear stockpile, and stopped making nuclear weapons.

Since the nation no longer built new warheads, it needed to extend the lifespan of the ones it had. It also needed to assess the performance of those aging and reconditioned weapons — and it had to do it without testing.

Today, the Department of Energy relies on computational simulations to predict weapons performance, employing several of the world’s largest supercomputers and some of the
most sophisticated computer models ever created. Yet the models raise questions of their own. They are large, complex combinations of smaller
models. Often, those models are based on different understandings of how physics works. They sometimes deal with temperatures and pressures where the physics are not well
understood. Running such complex software also requires compromises that could change a model’s ability to predict future warhead behavior.

The task of the Los Alamos V&V team is to quantify those uncertainties and improve our confidence in the simulations.
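The flavor of that task can be conveyed with a toy example. The sketch below is not Los Alamos code and the model is invented for illustration; it shows the basic Monte Carlo idea behind uncertainty quantification: sample an uncertain input parameter many times, push each sample through the model, and report the spread in the predictions rather than a single number.

```python
import random
import statistics

def toy_model(strength):
    """Hypothetical response of a component under a fixed load.

    Stand-in for a real physics code: deformation shrinks as the
    material-strength parameter grows.
    """
    return 100.0 / strength

random.seed(42)  # make the sketch reproducible

# Suppose the strength parameter is known only to within about
# +/-10% of its nominal value (an assumed, illustrative uncertainty).
nominal = 50.0
samples = [random.gauss(nominal, 0.1 * nominal) for _ in range(10_000)]

# Propagate the input uncertainty through the model.
results = [toy_model(s) for s in samples]
mean = statistics.mean(results)
spread = statistics.stdev(results)

print(f"predicted response: {mean:.3f} +/- {spread:.3f}")
```

Instead of one deceptively precise answer, the simulation’s output now carries an error bar, which is the kind of quantified confidence the V&V effort aims to attach to far larger weapons models.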

Inherent Problems

Today’s computer models produce incredibly detailed results that appear to precisely
simulate the behavior of weapons and materials as they age.

Yet do they? The uncertainties start with construction of the models themselves.

“Many of the mathematical models we use to understand physics are limited
in what they explain,” Doebling says. One theory, such as electrodynamics, may elucidate the behavior of a few atoms individually. Yet it cannot portray the transfer of heat or force
through a block of material made of those atoms. That would take a second theory, such as thermodynamics or mechanics.

Alan S. Brown
