Complex simulations are unraveling the idiosyncrasies of this unique liquid’s behavior.
Connecting equations
A Sandia National Laboratories mathematician is honored for his work developing mathematical methods for supercomputers.
Enhancing enzymes
High-quality computational models of enzyme catalysts could lead to improved access to biofuels and better carbon capture.
Cloud forebear
A data-management system called PanDA, built to analyze the universe’s building blocks, anticipated cloud computing.
Booting up Trinity
The unusual architecture in Los Alamos National Laboratory’s newest supercomputer is a step toward the exascale – systems around a hundred times more powerful than today’s best machines.
Materials cookbook
A Berkeley Lab project computes a range of materials properties and boosts the development of new technologies.
No passing zone
Lawrence Livermore National Laboratory models the blood-brain barrier to find ways for drugs to reach their target.
Chipping away
Repurposing an old chip might change the path to tomorrow’s fastest supercomputers, Argonne National Laboratory researchers say.
Nanogeometry
With a boost from the Titan supercomputer, a Berkeley Lab group works the angles on X-rays to analyze thin films of interest for the next generation of nanodevices.
Forecasting failure
Sandia National Laboratories aims to predict failure by modeling physics at the micrometer scale.
Early-universe soup
ORNL’s Titan supercomputer is helping Brookhaven physicists understand the matter that formed microseconds after the Big Bang.
Multitalented metric
An alternative computing benchmark emerges to better reflect the performance of real scientific applications.
Bright future
The smart grid turns to high-performance computing to guide its development and keep it working.
Powering down
A PNNL team views ‘undervolting’ — turning down the power supplied to processors — as a way to make exascale computing feasible.
Analysis restaurant
The AnalyzeThis system deals with the rush of huge data-analysis orders typical in scientific computing.
Noisy universe
Berkeley Lab cosmologists sift tsunamis of data for signals from the birth of galaxies.
Layered look
With help from the Titan supercomputer, an Oak Ridge National Laboratory team is peering at the chemistry and physics between the layers of superconducting materials.
Bits of corruption
Los Alamos’ extensive study of HPC platforms finds silent data corruption in scientific computing – but not much.
Sneak kaboom
At Argonne, research teams turn to supercomputing to study a phenomenon that can trigger surprisingly powerful explosions.
Slippery subject
University of Texas researchers are out to improve computational models of ice sheets by refining estimates of basal friction: how much underlying rock and earth slow a sheet’s movement.
A smashing success
The world’s particle colliders unite to share and analyze massive volumes of data.
Back to the hydrogen future
At Lawrence Livermore National Laboratory, Computational Science Graduate Fellowship alum Brandon Wood runs the world’s most sophisticated molecular dynamics codes on America’s leading supercomputers to model hydrogen’s reaction kinetics.
Joint venture
Sandia National Laboratories investigators turn to advanced modeling to test the reliability of the joints that hold nuclear missiles together.
Life underground
A PNNL team models deep-earth water flows that affect tiny organisms capable of making big contributions to climate-changing gases.
Supernova shocks
More than 10 years after simulations first suggested its presence, observations appear to confirm that a key instability drives the shock behind one kind of supernova.