Permafrost creates a polygonal landscape whose irregularity makes simulating thawing’s impact on climate change a challenge, one that requires advanced algorithms and high-performance computers. (Photo: Konstanze Piel, Alfred Wegener Institute.)

Permafrost – soil that stays frozen year-round – comprises about one-quarter of the Northern Hemisphere’s land. For now.

Climate change is turning permafrost into mush over an increasing area. Besides creating soggy soil, this thawing generates warming that drives further climate change.

No one is sure, however, how much impact this feedback loop has. Finding the answer demands high-performance computing, advanced algorithms and, as it turns out, some unexpected interactions.

That includes the intersecting careers of two Department of Energy Computational Science Graduate Fellowship (DOE CSGF) recipients. The first was Glenn Hammond, now a computational geohydrologist at Sandia National Laboratories in New Mexico. As a graduate student, Hammond served a 1999 practicum with Peter Lichtner, a multiphase flow and reactive-transport modeling expert at Los Alamos National Laboratory. Hammond’s project involved building PFLOTRAN, a parallel-processing version of the existing FLOTRAN code that models how groundwater reacts with the soil and rock it flows through. Parallel processing parcels pieces of a computational task out to multiple computer processors that work simultaneously to quickly complete the job.
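The domain-decomposition idea behind that kind of parallelization can be sketched in a few lines. This is a minimal illustration, not PFLOTRAN's actual approach (which uses MPI and Fortran); the grid, the `subdomain_sum` stand-in computation, and all names here are hypothetical.

```python
from multiprocessing import Pool

def subdomain_sum(cells):
    # stand-in for a real computation (e.g., a flow solve) on one subdomain
    return sum(cells)

def parallel_total(grid, nprocs=4):
    # split the 1-D "grid" into roughly equal subdomains, one per worker
    chunk = (len(grid) + nprocs - 1) // nprocs
    pieces = [grid[i:i + chunk] for i in range(0, len(grid), chunk)]
    with Pool(nprocs) as pool:
        partials = pool.map(subdomain_sum, pieces)  # subdomains run concurrently
    # combine the partial results into the global answer
    return sum(partials)

if __name__ == "__main__":
    grid = list(range(1000))
    print(parallel_total(grid))
```

Each worker sees only its own piece of the grid, which is the essence of parcelling a computational task out to many processors at once.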

“When you take existing code and try to parallelize it, it can be a bit of a challenge,” Hammond says. “As a new student learning parallelization techniques and new code, I started developing my own code from scratch, modeled very similar to FLOTRAN.”

Later, Richard Mills, who recently joined Intel Corp. after working as a computational scientist at Oak Ridge National Laboratory, spent his 2003 DOE CSGF practicum working under Lichtner to continue Hammond’s PFLOTRAN work. Mills implemented the code’s initial solver for groundwater flow.

Many other scientists also contributed to PFLOTRAN over the years and, Mills says, “it turned into a fairly comprehensive code for modeling surface and subsurface hydrology.” In other words, it simulates water and the associated biogeochemical reactions that occur as it moves over or through soil.

Coding the Equations

Although the code is designed to run in parallel on high-performance computers, Mills says the team wanted to make PFLOTRAN flexible enough to run on both the biggest supercomputers and on laptops.

Hammond and Mills grew up in a C and C++ coding world, but most scientists who study hydrology use an older language, FORTRAN. The two former fellows followed that tradition for PFLOTRAN but enlisted modern FORTRAN features to add a degree of object-oriented structure more characteristic of C or C++ code.
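Modern Fortran supports derived types with type-bound procedures, which play roughly the role classes do in C++. As a loose analogy only (in Python, with entirely hypothetical names, not PFLOTRAN's real classes), the pattern looks like this: a driver is written against an abstract solver interface, so concrete solvers can be swapped in without touching the driver.

```python
class FlowSolver:
    """Abstract solver interface; concrete solvers override step()."""
    def step(self, state, dt):
        raise NotImplementedError

class ToySolver(FlowSolver):
    """Hypothetical stand-in for a groundwater-flow solver."""
    def step(self, state, dt):
        # placeholder update; a real solver would advance a PDE here
        return [h + dt * 0.1 for h in state]

def advance(solver: FlowSolver, state, dt, nsteps):
    # the time loop depends only on the abstract interface,
    # so different physics modules plug in without changing it
    for _ in range(nsteps):
        state = solver.step(state, dt)
    return state
```

This kind of separation between a generic driver and interchangeable physics modules is one benefit the object-oriented structure brings to a large simulation code.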


Mike May
