
Inside the skull

The geometry and some blood flow conditions were derived from a real patient. Blood properties (viscosity, for example) were based on data for an average person. With this information, the researchers used ANL and ORNL computers to refine the code to run successfully on HPCs. “A simulation run on an HPC such as Blue Gene/P can involve a very large number of cores – more than 131,000 – and take almost 25 hours nonstop to complete for just one heartbeat,” Grinberg says. “These simulations take a long time to run even on an HPC, and you don’t want the code to fail after only a minute or one hour.”

These are monumental computations. Modeling just 1 cubic millimeter of human blood involves 1 billion to 2 billion particles simulated with dissipative particle dynamics, or DPD. “DPDs involve a technique to simulate the dynamics of clusters of particles, instead of each particle individually,” Morozov says. “In other words, each DPD particle represents a set of many particles and is treated as if it is a big individual particle.”
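To make the coarse-graining idea concrete, here is a minimal sketch of a DPD force calculation and time step. It assumes the standard textbook DPD formulation – a soft conservative repulsion plus pairwise friction and thermal noise tied together by a fluctuation-dissipation relation – with arbitrary toy parameters; it is not the research group's code, which runs at vastly larger scale on HPC systems.

```python
# Minimal DPD sketch (assumed standard formulation, toy parameters);
# illustrative only, not the authors' production code.
import numpy as np

rng = np.random.default_rng(0)

N, rc, dt = 50, 1.0, 0.01      # particle count, cutoff radius, time step
a, gamma, kT = 25.0, 4.5, 1.0  # repulsion strength, friction, temperature
sigma = np.sqrt(2.0 * gamma * kT)  # fluctuation-dissipation relation
box = 5.0                      # periodic box size

pos = rng.uniform(0.0, box, (N, 3))
vel = np.zeros((N, 3))

def dpd_forces(pos, vel):
    """Sum the conservative, dissipative and random DPD forces per particle."""
    F = np.zeros_like(pos)
    for i in range(N):
        for j in range(i + 1, N):
            rij = pos[i] - pos[j]
            rij -= box * np.round(rij / box)   # minimum-image convention
            r = np.linalg.norm(rij)
            if r >= rc or r == 0.0:
                continue
            e = rij / r                        # unit vector along the pair
            w = 1.0 - r / rc                   # linear weight function
            vij = vel[i] - vel[j]
            fc = a * w                                   # soft repulsion
            fd = -gamma * w * w * np.dot(e, vij)         # pairwise friction
            fr = sigma * w * rng.standard_normal() / np.sqrt(dt)  # noise
            f = (fc + fd + fr) * e
            F[i] += f                          # equal and opposite forces,
            F[j] -= f                          # so momentum is conserved
    return F

# One explicit Euler step (production codes use a modified velocity Verlet)
F = dpd_forces(pos, vel)
vel += F * dt
pos = (pos + vel * dt) % box
```

Because every pairwise force is applied equally and oppositely, total momentum is conserved exactly, which is what lets clusters of DPD particles reproduce hydrodynamic flow at much lower cost than tracking every molecule.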

From centimeters to microns

“We have been able to resolve simulations at multiple scales, not just one as others have done, and in parallel fashion,” Grinberg says. In other words, the team’s methods make it possible for the first time to simultaneously see a snapshot of blood flow throughout the entire brain on a scale that ranges from as large as 4 to 5 centimeters down to less than a micron.

“This is a truly unique advance in computational science,” Morozov says. “Many different approaches have been suggested in the past, but none of them really succeeded in getting practical results.”

Grinberg expects that the methods the group developed also will be useful for simulating other subjects, such as crack propagation in aircraft surfaces at the molecular level. “This type of research overlaps very well with what we are trying to do, but we are working on a different problem: We model blood cells as they aggregate and interact on the larger scales.”

So what’s the outlook for multiscale simulations in medicine? Grinberg says researchers must increase model complexity and devise codes that are fast enough to perform longer multiscale simulations than now possible. Hospitals and other medical institutions are turning more and more to computing to model different problems and to predict the consequences of surgery.

But until exaflop computers – roughly a thousand times more powerful than today’s best machines – arrive in a decade or more, it’s unlikely that front-line doctors will be able to use real-time simulations and visualizations like the relatively simple, early ones Grinberg and others are developing. Nonetheless, Grinberg predicts, future doctors will receive training that incorporates HPCs in their practices, and hospitals will staff specialists who can tap into computing resources at national centers.

Says Grinberg, “The day will come when we’ll be able to develop software that can enable very sophisticated models and simulations – how many more types of cells interact, for example – and physicians will have much better understanding of how to treat their patients.”
