Targeting tumors
Cancer biology gets the supercomputing treatment on Oak Ridge’s Summit and Lawrence Livermore’s Sierra.
The superfacility concept links high-performance computing capabilities across multiple scientific locations for scientists in a range of disciplines.
Pairing large-scale experiments with high-performance computing can reduce data processing time from several hours to minutes.
Nature inspires Oak Ridge National Laboratory algorithms for neuromorphic processors.
Borrowing from the brain
Sandia National Laboratories researchers seek to connect quantum and classical calculations in a drive for a new supercomputer paradigm.
An Argonne researcher upgrades supercomputer optimization algorithms to boost reliability and resilience in U.S. power systems.
A Los Alamos National Laboratory and Carnegie Mellon University exascale file system helps scientists sort through enormous amounts of records and data to accelerate discoveries.
A DOE CSGF recipient at the University of Texas took on a hurricane-flooding simulation and blew away limits on its performance.
A DOE computational science fellow combines biology, technology and more to explore behavior, swarms and space.
A Berkeley Lab team tags dramatic weather events in atmospheric models, then applies supercomputing and deep learning to refine forecasts.
A Brookhaven-Rutgers group uses supercomputing to target the most promising drug candidates from a daunting number of possibilities.
Molecular landscaping
A Livermore team takes a stab, atom-by-atom, at an 80-year-old controversy over a metal-shaping property called crystal plasticity.
Forged in a Firestorm
Argonne National Laboratory’s Aurora will take scientific computing to the next level. Visualization and analysis capabilities must keep up.
Aiming to expand their technology options, Vanguard program researchers are testing a prototype supercomputer built with ARM processors.
Exascale computing, combined with redesigned computational chemistry software, could help researchers develop new renewable energy materials and greener chemical processes.
Revving up chemistry
A Los Alamos tool gives users the flexibility to more easily run stockpile-stewardship physics simulations.
Multiphysics models for the masses
Supercomputing power and increasing genomic data are allowing Oak Ridge researchers to examine drought tolerance in plants and other big biological questions.
A Berkeley Lab-Northwestern University team follows fish movements to build energy-efficiency algorithms.
A supercomputing co-design collaboration involving academia, industry and national labs tackles exascale computing’s monumental challenges.
A Brookhaven National Laboratory computer scientist is building software to help researchers interact with their data in new ways.
A Sandia National Laboratories mathematician is honored for his work creating methods for supercomputers.
Connecting equations
High-quality computational models of enzyme catalysts could lead to improved access to biofuels and better carbon capture.
A data-management system called PanDA anticipated cloud computing to analyze the universe’s building blocks.
The unusual architecture in Los Alamos National Laboratory’s newest supercomputer is a step toward the exascale – systems roughly a hundred times more powerful than today’s best machines.