Computational scientists often use Bayesian inference, a mathematical technique, to quantify uncertainty. The method starts with two distributions: a prior, encoding researchers’ existing knowledge of the parameters, and a likelihood, describing how probable the observed data are for a given parameter set. It combines them to create a posterior distribution of the parameters – essentially a graph of the probability that a parameter set contains the “true” values generating the observed data.
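A minimal sketch of that idea in Python, using a hypothetical one-parameter toy problem (not the group’s ice sheet model): the posterior over a grid of candidate values is just the prior multiplied by the likelihood, normalized.

```python
import numpy as np

# Hypothetical 1-D illustration, not the paper's ice sheet model:
# infer a single parameter theta from noisy observations.
rng = np.random.default_rng(0)
true_theta = 2.0
data = true_theta + 0.5 * rng.standard_normal(20)   # noisy observations

thetas = np.linspace(-2.0, 6.0, 400)                # candidate parameter values
prior = np.exp(-0.5 * (thetas / 2.0) ** 2)          # prior knowledge: theta near 0, spread 2

# Likelihood: how probable the observed data are for each candidate theta.
likelihood = np.array(
    [np.exp(-0.5 * np.sum((data - t) ** 2) / 0.5**2) for t in thetas]
)

# Bayes' rule: the posterior is proportional to prior times likelihood.
posterior = prior * likelihood
posterior /= posterior.sum() * (thetas[1] - thetas[0])  # normalize to integrate to 1

map_estimate = thetas[np.argmax(posterior)]             # most probable parameter value
```

The grid approach works here because there is a single parameter; for an ice sheet model, where the parameter is a field with millions of unknowns, no such grid is possible – which is what motivates the methods described below.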

“All methods boil down to that,” Isaac says. “Each time we want to evaluate a point of that posterior distribution, that requires us to solve our system” of nonlinear equations for the forward ice sheet model. That presents tremendous challenges, the paper says, because standard algorithms require sampling hundreds of thousands of values from the distribution for estimates to converge on an answer.
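The cost structure can be seen in a toy Metropolis sampler (a standard technique, sketched here with a hypothetical one-parameter forward model standing in for the ice sheet solve): every posterior evaluation inside the loop triggers one forward-model call, so drawing many samples means many solves.

```python
import numpy as np

def forward_model(theta):
    """Stand-in for the expensive nonlinear forward solve.
    In the real problem, each call means solving a PDE system."""
    return theta ** 2  # toy model output

def log_posterior(theta, observed, noise_sd=0.1, prior_sd=1.0):
    # Every posterior evaluation triggers one forward solve.
    prediction = forward_model(theta)
    log_like = -0.5 * ((observed - prediction) / noise_sd) ** 2
    log_prior = -0.5 * (theta / prior_sd) ** 2
    return log_like + log_prior

def metropolis(observed, n_samples=5000, step=0.2, seed=0):
    rng = np.random.default_rng(seed)
    theta = 0.5
    current = log_posterior(theta, observed)
    samples = []
    for _ in range(n_samples):          # n_samples forward solves in total
        proposal = theta + step * rng.standard_normal()
        cand = log_posterior(proposal, observed)
        if np.log(rng.random()) < cand - current:   # accept/reject step
            theta, current = proposal, cand
        samples.append(theta)
    return np.asarray(samples)

samples = metropolis(observed=0.64)  # "data" consistent with theta near 0.8
```

Here each forward call is a single multiplication; replace it with a nonlinear PDE solve and the hundreds of thousands of evaluations needed for convergence become prohibitive.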

Other research groups are tackling the inverse problem for ice sheet parameters. “What differentiates our work is we’ve built it around a more accurate forward model,” Isaac says. “We’re not simplifying the physics quite as much,” but that also means solving a more complex set of equations.

Instead of sampling the complex posterior distribution, the technique Isaac developed with his advisor, Omar Ghattas, and others approximates it with a simpler, bell-shaped Gaussian distribution. The algorithm incorporates observational data to calculate the most likely point on the curve, then redraws the bell curve with that point at the peak.
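This is the classic Laplace approximation. A toy 1-D sketch (with a hypothetical negative log-posterior, not the paper’s PDE-constrained one): find the peak with Newton’s method, then use the curvature there to set the width of the Gaussian.

```python
import numpy as np

# Hypothetical toy negative log-posterior: 0.5*(x - 1)**2 + 0.25*x**4
def grad(x):    # its first derivative
    return (x - 1.0) + x ** 3

def hess(x):    # its second derivative ("Hessian" in 1-D)
    return 1.0 + 3.0 * x ** 2

# Newton's method: curvature information guides the search for the peak
# (the maximum a posteriori, or MAP, point).
x = 0.0
for _ in range(20):
    x -= grad(x) / hess(x)

map_point = x
# Laplace approximation: posterior ~ Normal(map_point, sigma**2),
# with the variance set by the inverse curvature at the peak.
sigma = 1.0 / np.sqrt(hess(map_point))
```

The same curvature that steers the optimization also fixes the bell curve’s shape – the connection the article describes between finding the most likely point and getting the uncertainty estimate.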

To guide the search for that most likely point, the method also calculates the effects of the Hessian, a matrix of second derivatives that describes the “curvature” of the posterior distribution. The entire Hessian is too big to calculate directly for large problems, so the method solves a secondary set of equations derived from the forward ice sheet model to compute only the pieces it needs – the Hessian’s action on particular vectors – without ever forming the full matrix.
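A hedged sketch of that matrix-free idea on a hypothetical two-parameter problem (the real method obtains Hessian actions from adjoint equations; here they come from differencing gradients): a conjugate-gradient loop solves the Newton system using only Hessian-vector products, never the matrix itself.

```python
import numpy as np

def gradient(theta):
    """Stand-in gradient of a negative log-posterior; in the real method
    this would come from solving adjoint equations of the forward model."""
    A = np.array([[4.0, 1.0], [1.0, 3.0]])   # hidden curvature, never formed below
    return A @ theta - np.array([1.0, 2.0])

def hess_vec(theta, v, eps=1e-6):
    # Hessian-vector product by differencing gradients: no full matrix needed.
    return (gradient(theta + eps * v) - gradient(theta)) / eps

def newton_cg_step(theta, tol=1e-8, max_iter=50):
    """Solve H p = -g with conjugate gradients, using only Hessian actions."""
    g = gradient(theta)
    p = np.zeros_like(theta)
    r = -g.copy()            # residual of H p = -g at p = 0
    d = r.copy()             # search direction
    for _ in range(max_iter):
        Hd = hess_vec(theta, d)
        alpha = (r @ r) / (d @ Hd)
        p += alpha * d
        r_new = r - alpha * Hd
        if np.linalg.norm(r_new) < tol:
            break
        d = r_new + ((r_new @ r_new) / (r @ r)) * d
        r = r_new
    return theta + p

theta_star = newton_cg_step(np.zeros(2))   # one Newton step from the origin
```

Because the toy posterior is quadratic, a single Newton step lands on the minimizer; on the real nonlinear problem the step repeats, but – as the article notes – the iteration count stays roughly flat as the problem grows.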

These Hessian calculations are scalable: The number of steps needed to find the most likely point is generally independent of the problem size. The Hessian also describes the shape of the Gaussian distribution built around the most likely point. “So our methods not only get us that most likely point, but they get us that bell curve,” Isaac adds.

At the method’s heart is a scalable algorithm to solve the nonlinear Stokes equations describing slow-moving, viscous flows – such as an ice sheet. The researchers used a classical method that converts the work into a series of linear equations, each of which is iteratively solved.
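One classical approach of that kind is a lagged-viscosity (Picard) iteration: freeze the solution-dependent viscosity, solve the resulting linear system, and repeat. A hedged 1-D sketch with a hypothetical Glen’s-law-like shear-thinning viscosity – a stand-in for, not the paper’s actual Stokes solver:

```python
import numpy as np

# Toy 1-D nonlinear problem: -d/dx( mu(u') u' ) = f on [0, 1], u(0) = u(1) = 0,
# with shear-thinning viscosity mu(s) = (s**2 + eps)**((1 - n)/(2*n)), n = 3.
n_glen, eps = 3.0, 1e-4
N = 100
x = np.linspace(0.0, 1.0, N + 1)
h = x[1] - x[0]
f = np.ones(N - 1)           # uniform forcing at interior nodes
u = np.zeros(N + 1)

for _ in range(200):         # Picard: freeze viscosity, solve linear problem, repeat
    du = np.diff(u) / h                                  # gradient on each cell
    mu = (du**2 + eps) ** ((1.0 - n_glen) / (2.0 * n_glen))
    # Assemble the tridiagonal system for -d/dx(mu du/dx) = f with frozen mu.
    A = np.zeros((N - 1, N - 1))
    for i in range(N - 1):
        A[i, i] = (mu[i] + mu[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = -mu[i] / h**2
        if i < N - 2:
            A[i, i + 1] = -mu[i + 1] / h**2
    u_new = np.zeros(N + 1)
    u_new[1:-1] = np.linalg.solve(A, f)   # the linear solve inside each iteration
    delta = np.max(np.abs(u_new - u))
    u = u_new
    if delta < 1e-10:        # stop once successive iterates agree
        break
```

Each pass through the loop is a linear solve; at the scale of a full Antarctic model, those inner solves are themselves iterative, which is where the scalability question arises.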

In tests on Titan, Oak Ridge National Laboratory’s Cray XK7 supercomputer, the solver scaled well – but not perfectly. Preparing the problem for solution also proved comparatively slow. “It’s the setup of the linear solver that’s the bottleneck,” Isaac says. Apart from scaling tests, most calculations ran on Stampede, the Dell PowerEdge cluster at UT’s Texas Advanced Computing Center.

The project focused on demonstrating a scalable technique for inverse problems with uncertainty quantification. The result was maps showing Antarctic regions where changes in the basal friction parameter most affect ice sheet velocity. But the ice sheet models also matched velocity data well, especially in critical coastal regions, when using the inferred basal friction parameters.

Isaac and other members of the Ghattas group are improving the method. The inverse framework is broad enough for use on other problems, Isaac says, and could be made more efficient by accounting for some aspects of the ice problem. They’re also working on improving the Stokes solver’s efficiency on big problems.

Isaac, meanwhile, plans to graduate in the spring. He’s uncertain what the future holds after that.


Thomas R. O'Donnell

The author is a former Krell Institute science writer.
