Pavel Bochev had no idea who he would work with when he moved from Bulgaria to Virginia Tech in 1990 to pursue a Ph.D. in applied mathematics. His master’s thesis in his native country had focused on ordinary differential equations, which generally relate physical quantities to their derivatives, such as a rate of change.

But at Virginia Tech he met Max D. Gunzburger, now the Francis Eppes Distinguished Professor of Mathematics at Florida State University. Gunzburger researched computational fluid dynamics (CFD), which simulates the complex movements of fluids like gases and liquids. To solve these problems, he explored finite element methods (FEM), a way to discretize, or divide, the domain being modeled into pieces, called elements. Solving equations for the physical processes in each element can capture behavior throughout the entire domain.

“I had very little experience in this area,” recalls Bochev, now a distinguished member of the technical staff in the Center for Computing Research at Sandia National Laboratories.

Studying with Gunzburger meant Bochev would have to switch his focus to partial differential equations (PDEs), demanding mathematical problems that contain multiple functions, such as physical quantities, and their partial derivatives.

“This was one of the best decisions I’ve ever made,” Bochev says, “as Max turned out to be an amazing teacher and mentor, and we remain close friends and collaborators to this day.”

It also started Bochev on a distinguished career in applied mathematics, highlighted with the 2017 Thomas J.R. Hughes Medal from the U.S. Association for Computational Mechanics.

Bochev completed his Ph.D. and spent a few years in academia before moving to Sandia’s Albuquerque, New Mexico, lab in 2000. Since then he’s expanded his research beyond CFD to computational electromagnetism, transmission and interface problems, drift-diffusion equations and Darcy flows, which describe fluid movement through a porous medium.

“One of the amazing things about being a numerical analyst in a national lab is that you have an inexhaustible supply of interesting problems to work on,” Bochev says. “The types of simulation required for mission problems often push our numerical methods beyond their theoretical boundaries and take them to uncharted territories.”

For these explorations, Bochev relies heavily on FEM, which he calls “the most mathematically advanced simulation technology for PDEs at our disposal.” It rests on solid mathematical foundations, giving scientists confidence in the resulting simulations, while offering “tremendous flexibility” for computation. For example, FEM provides multiple options to discretize a given PDE. Mathematicians can choose the best one, Bochev explains, “which for a given set of software and modeling requirements will yield the most robust and effective simulation tools.”
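The basic FEM recipe can be seen in a minimal sketch (an illustration assumed for this article, not code from Bochev's work): divide the interval [0, 1] into linear elements, assemble a stiffness matrix from the "hat" basis functions, and solve for the unknowns at the nodes. Here the PDE is the simplest possible one, -u''(x) = 1 with u(0) = u(1) = 0, whose exact solution is u(x) = x(1 - x)/2.

```python
import numpy as np

# Minimal 1D finite element sketch: solve -u''(x) = 1 on [0, 1]
# with u(0) = u(1) = 0 using piecewise-linear elements.

n = 50                      # number of elements
h = 1.0 / n                 # uniform element size
nodes = np.linspace(0.0, 1.0, n + 1)

# Stiffness matrix for linear "hat" basis functions: tridiag(-1, 2, -1) / h
A = (np.diag(2.0 * np.ones(n - 1))
     - np.diag(np.ones(n - 2), 1)
     - np.diag(np.ones(n - 2), -1)) / h

# Load vector for f = 1: each interior hat function integrates to h
b = h * np.ones(n - 1)

u = np.zeros(n + 1)         # boundary values stay zero
u[1:-1] = np.linalg.solve(A, b)

exact = nodes * (1.0 - nodes) / 2.0
print(np.max(np.abs(u - exact)))
```

For this particular 1D problem the nodal values of the FEM solution match the exact solution, a classical superconvergence property; in higher dimensions and for harder PDEs, the choice among discretizations that Bochev describes becomes the crux of the matter.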

His biggest contribution to numerical PDEs, Bochev says, may be recent work on methods to solve equations governing transport. Multiple DOE projects are evaluating his approaches to address problems such as modeling atmospheric contaminants or subsurface flows.

“The problems that we look at come from climate applications, like simulations of the atmosphere, or coal-fired power plants emitting particles and pollutants,” Bochev says. “We apply this optimization idea to better simulate these processes by preserving physically motivated bounds on their solutions.”
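One way to picture the optimization idea is with a toy sketch (a hedged illustration, not Bochev's actual algorithm): a transport step can leave a concentration field with small, unphysical over- and undershoots, and the repair is to find the closest field, in the least-squares sense, that respects the physical bounds while conserving total mass. The solution of that small optimization problem has the form clip(raw + lam, lo, hi) for a scalar multiplier lam, found here by bisection.

```python
import numpy as np

# Toy bound-preserving repair: project cell values onto the set of fields
# that lie in [lo, hi] and carry a prescribed total mass. This is a small
# quadratic program whose minimizer is clip(raw + lam, lo, hi) for some
# scalar lam; bisection locates lam since the clipped mass is monotone.

def enforce_bounds(raw, lo, hi, mass):
    def clipped_mass(lam):
        return np.sum(np.clip(raw + lam, lo, hi))
    a, b = lo - raw.max(), hi - raw.min()   # bracket for lam
    for _ in range(100):
        mid = 0.5 * (a + b)
        if clipped_mass(mid) < mass:
            a = mid
        else:
            b = mid
    return np.clip(raw + 0.5 * (a + b), lo, hi)

# Example: a concentration field with spurious over- and undershoots
raw = np.array([-0.05, 0.40, 1.10, 0.80, 0.25])
fixed = enforce_bounds(raw, 0.0, 1.0, raw.sum())
```

The repaired field stays in [0, 1] and carries exactly the same total mass as the raw one, which is the kind of physically motivated bound preservation the quote describes.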

Transport also turns up where the name suggests: in traffic. “Surprisingly enough, similar equations to the ones we use for transport simulations describe traffic flow on highways,” Bochev says. “These equations can explain why everyone slams on their brakes for no reason.” The culprit is a shock wave. So the transport PDEs Bochev and his colleagues use reach well beyond atmospheric applications.
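The traffic analogy can be made concrete with a short sketch (an assumed example, not from the article): the classic Lighthill-Whitham-Richards model treats car density rho as a conserved quantity obeying rho_t + (rho(1 - rho))_x = 0. A smooth patch of denser traffic ahead makes characteristics converge, so the profile steepens into a shock, the sudden wall of brake lights drivers experience. A Godunov finite-volume scheme captures it:

```python
import numpy as np

# Lighthill-Whitham-Richards traffic model: rho is car density
# (0 = empty road, 1 = bumper to bumper), with Greenshields flux.

def flux(rho):
    return rho * (1.0 - rho)          # maximum throughput at rho = 0.5

def godunov(ul, ur):
    """Godunov numerical flux for the concave LWR flux."""
    if ul <= ur:
        return min(flux(ul), flux(ur))
    if ur <= 0.5 <= ul:
        return flux(0.5)              # sonic point inside the fan
    return max(flux(ul), flux(ur))

n, dx = 200, 1.0 / 200
x = (np.arange(n) + 0.5) * dx
rho = 0.5 + 0.3 * np.tanh((x - 0.5) / 0.1)   # smooth ramp-up in density
grad0 = np.max(np.abs(np.diff(rho))) / dx

dt, t_end, t = 0.4 * dx, 0.3, 0.0            # CFL-stable time step
while t < t_end:
    padded = np.concatenate(([rho[0]], rho, [rho[-1]]))   # outflow BCs
    F = np.array([godunov(padded[i], padded[i + 1]) for i in range(n + 1)])
    rho = rho - dt / dx * (F[1:] - F[:-1])
    t += dt

grad1 = np.max(np.abs(np.diff(rho))) / dx
print(grad0, grad1)   # the maximum gradient grows sharply as the shock forms
```

The initially gentle density ramp sharpens into a near-discontinuity: the shock wave that makes everyone slam on the brakes.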

Everyone can envision a traffic jam, but many of the problems Bochev and other national-lab scientists address are tougher to grasp. “DOE uses computer models to understand, predict and verify complex systems in high-consequence analyses that would be difficult or even impossible by other means,” he says. Moving to exascale computing, hundreds of times more powerful than today’s best machines, will let researchers explore difficult problems in new ways, ones that incorporate multiple time and length scales and multiple types of physics. They’ll use new discretizations that vary to better fit different parts of a simulation, such as in Earth system models, nuclear waste repository assessments, or examinations of the nuclear stockpile’s integrity. This, Bochev notes, requires new numerical methods that address challenges such as maintaining the physical properties of a simulation’s components. “Stable and accurate discretizations are not necessarily property-preserving,” he adds.

Combining different numerical systems will also be challenging, a task Bochev compares to building with LEGOs. “Like the bricks have to stick together to build a contraption, this is what it takes to simulate a complex, multiphysics system,” he says. “You need the blocks, and they need to stick together to not collapse.”

Such heterogeneous systems grab Bochev’s attention. Specifically, he wants to use optimization and control ideas to build property-preserving solutions. “There’s a rich mathematical theory of optimization methods that could be leveraged to analyze such methods and establish rigorously their stability and accuracy properties,” he says. “We have developed and demonstrated this idea for the coupling of diverse numerical models.”

In building these models, Bochev also must consider what lies ahead in high-performance computing architectures, which are likely to be drastically different from today’s. “Understanding the interplay between, for example, programming models on the one hand and numerical algorithms on the other becomes imperative to ensure sustained performance on next-generation platforms.”

Bochev gained new insights into strategies for these evolving computing platforms when he met Nat Trask, then a Brown University graduate student working on mesh-free and particle methods. Now a postdoctoral fellow at Sandia, Trask hooked Bochev on these techniques. “Nat’s enthusiasm for these methods was contagious,” Bochev says, “and I got really interested in this class of discretizations that in many ways is the opposite to FEM.” Now Bochev leads a new Sandia project, working with Trask and several other colleagues to build meshless methods for DOE simulations.

“We work in a field that is constantly being changed by trying to solve more complex problems and incorporate richer behavior in models,” Bochev says. To keep up, he and his colleagues must keep learning and building new numerical methods.

