Simulations at Sandia National Laboratories reveal that using magnetism to heat and insulate fusion fuel could recreate solar conditions in the lab.
Filling in the blanks
To prevent important information from being missed, a Berkeley Lab team is improving how supercomputers divide the heavy work of analyzing and visualizing large simulations.
Overcoming resistance
To find a path around antibiotic resistance, a team working with the Intrepid supercomputer at Argonne National Laboratory is simulating molecular binding interactions to rapidly vet new infection-fighting candidates.
Kinky nanotubes
With the help of Oak Ridge computations, scientists are probing the properties of macroscale sponges made of nanoscale carbon-boron tubes. The material could soak up oil spills, help store energy or meet other needs.
A passion for pressure
Plasmas are the purview of Livermore scientist and Computational Science Graduate Fellowship alumnus Jeffrey Hittinger. He works both sides of the fusion street – inertial confinement and magnetic confinement – while simulating aspects of these tremendously hot, fast-moving particle clouds.
Twice-stuffed permafrost
A Pacific Northwest National Laboratory computation suggests that the water-gas compounds found in ocean permafrost can provide energy and store it, too – and then trap carbon dioxide.
Enlightening predictions
Computer simulations of hurricane lightning could be the key to predicting and avoiding the storms’ real-world punch.
Inside the skull
Modeling the elements of blood flow in the brain could help neurosurgeons predict when and where an aneurysm might rupture – and when to operate.
Power boost
Berkeley scientists have combined computational modeling and advanced materials synthesis to devise a low-cost anode that bolsters the feasibility of long-life lithium-ion batteries.
Seeing beyond 3-D
High-dimensional visualization techniques at Stony Brook and Brookhaven are helping reveal the interactions that drive climate and other complexities.
Helping hydrogen along
Researchers have pursued clean hydrogen-based fuels for years. A Berkeley Lab team hopes to spur that quest with help from one of the world’s most powerful computers.
A long view of Gulf oil spill
While others predicted when oil from the Deepwater Horizon spill in the Gulf of Mexico might reach beaches, ocean modelers at Los Alamos National Laboratory and the National Center for Atmospheric Research asked when gushing oil might exit the Gulf, where it would go and how diluted it would be, up to a year later.
Pounding out atomic nuclei
Thousands of tiny systems called atomic nuclei – specific combinations of protons and neutrons – prove extremely difficult to study but have big implications for nuclear stockpile stewardship. To describe all of the nuclei and the reactions between them, a nationwide collaboration is devising powerful algorithms that run on high-performance computers.
Laptop supercomputing
A small team led by Sandia National Laboratories is attempting to virtually put the world’s most powerful supercomputers on a user’s own desktop or laptop.
In climate modeling, speed matters
A Brookhaven team wants to build the ‘fast physics’ behind clouds, air-suspended particles and precipitation into global climate models.
Seeing the invisible
Armed with computing power from Oak Ridge National Laboratory, researchers are detailing the nature of dark matter surrounding a galaxy much like our own Milky Way.
Dark matter predictions put to test
Collisions in dark matter “clumps” should produce gamma rays, but a satellite looking for them has come up empty so far.
Parsing particle experiments
A detector suggested dark matter collisions, but no other test has seen similar signs.
Nuclear predictive
Argonne National Laboratory applies mathematics and computation to engineer the next generation of nuclear reactors.
Getting a grip on the grid
A PNNL team enlists new algorithms and powerful computers to quickly analyze which combinations of failures most threaten the power grid.
The master of Monte Carlo
Berni Alder’s Monte Carlo methods have solved problems across the scientific spectrum. Yet the Livermore-based National Medal of Science recipient still has questions.
Going big to study small
It takes a big computer to model very small things. And, like its namesake state, New York Blue is big. Made up of 36,864 processors, the massively parallel IBM Blue Gene/L is housed at DOE’s Brookhaven National Laboratory (BNL) on New York’s Long Island, where, among other things, it’s used to model quantum dots, or […]
Putting catalysts on track
Computation and experimentation combine to improve and speed design of useful compounds.
Breaking the biomass barrier
What Oak Ridge National Laboratory researchers are learning could help make ethanol from cellulose a viable fuel alternative – and help the United States replace foreign oil with a green, renewable resource.
Extending the stockpile’s lifespan
Just how does prolonged exposure to nuclear radiation change a material’s properties? How do those changes alter the way a weapon performs? A Los Alamos team quantifies these and other uncertainties.