
Seeing the invisible

This frame from the Via Lactea II visualization shows the dark matter halo as it might look today, more than 13.7 billion years since the Big Bang. Gravity has drawn the particles into dense clumps, which retain much of their structure as they are pulled toward the halo’s center. The color scale shows dark matter density increasing from blue to white.

The model indicates that hundreds of concentrated dark matter clumps survive even near our solar system. Earlier simulations with coarser resolution – larger representative particles – portrayed dark matter in this area as smooth.

Via Lactea and other simulations are helping make the case for the cold dark matter model, Kuhlen says. If dark matter were hotter, its thermal velocity would smooth out its structure rather than let it clump. Beyond that, Via Lactea says little about just what kind of particles make up dark matter, because it’s designed to resolve dark matter’s gravitational effects.

“We’re concerned with how dark matter is laid out on a larger scale than one in which individual particle physics would matter. At the same time, it doesn’t mean (the simulation is) irrelevant for experiments that attempt to directly detect dark matter particles.”

In fact, the group’s findings could help researchers understand experiments that attempt to directly detect rare collisions between dark matter particles and heavy atomic nuclei, like iodine, xenon or tungsten.

Via Lactea II produced a huge amount of data: about 20 terabytes (20 trillion bytes) – enough to fill about 100 average-sized personal computer hard drives. That’s tiny, however, compared to the 500 terabytes expected to come out of the next version, tentatively named Silver River, the East Asian nickname for the Milky Way.

Instead of dark matter particles of about 4,000 solar masses each, researchers plan to track particles of just 100 solar masses, Kuhlen says. “It’s like skipping a generation. It’s also why it’s causing us trouble. It’s a very big jump.”

The researchers have struggled with establishing the simulation’s initial conditions – the parameters for its starting point. The obstacle kept the group from using the bulk of a 2009 INCITE grant of 5 million processor-hours on Jaguar. The grant has been extended through 2010.

Led by Jürg Diemand, Joachim Stadel and Potter, the team is continuing to develop PKDGRAV2 and tackle the initial conditions problem. “The sheer size and high resolution of the initial conditions presented us with several new problems,” Stadel says.

It’s tricky because the researchers must zoom in on just a tiny region of the entire universe – an area producing a dark matter halo the size of the Milky Way, Stadel says. The simulation must take into account gravitational fields from around the universe while focusing most of its computing power on the halo itself.

The net result is a simulation that starts with about 10 billion dark-matter-halo particles, adds another 20 billion in the area immediately outside the halo and then throws in about 15 billion more-massive, lower-resolution particles. Tracking such a huge number of particles as they interact over billions of years is an enormous computational challenge.
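
The zoom-in idea is easier to see in miniature. Below is a minimal sketch in Python with NumPy – not anything from PKDGRAV2 – that builds a toy multi-resolution particle set whose counts mirror the 10:20:15 ratio above (scaled down enormously) and computes softened gravity with a brute-force direct sum, where the production code uses a far more sophisticated parallel solver. All particle counts, masses, box sizes and function names here are illustrative assumptions.

```python
# Toy sketch of a "zoom-in" setup: a small high-resolution region embedded in a
# coarser, more massive background. Every number here is illustrative, not from
# the real Silver River initial conditions.
import numpy as np

rng = np.random.default_rng(42)

def make_particles(n, box_size, mass):
    """Scatter n equal-mass particles uniformly through a cube centered on the origin."""
    positions = rng.uniform(-box_size / 2, box_size / 2, size=(n, 3))
    return positions, np.full(n, mass)

# Counts mirror the article's 10:20:15 ratio, scaled down by many orders of magnitude.
halo_pos, halo_m = make_particles(n=500, box_size=1.0, mass=1.0)        # high-resolution halo region
buffer_pos, buffer_m = make_particles(n=1_000, box_size=4.0, mass=8.0)  # area immediately outside the halo
coarse_pos, coarse_m = make_particles(n=750, box_size=16.0, mass=64.0)  # massive, low-resolution background
# (For simplicity this toy lets the regions overlap; a real zoom-in nests them carefully.)

positions = np.vstack([halo_pos, buffer_pos, coarse_pos])
masses = np.concatenate([halo_m, buffer_m, coarse_m])

def accelerations(pos, m, softening=0.05, G=1.0):
    """Direct-sum (O(N^2)) softened gravitational acceleration on every particle."""
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]        # separation vectors, shape (N, N, 3)
    inv_d3 = (np.sum(diff**2, axis=-1) + softening**2) ** -1.5  # softened 1/r^3
    np.fill_diagonal(inv_d3, 0.0)                                # exclude self-interaction
    return G * np.einsum('ijk,ij,j->ik', diff, inv_d3, m)

acc = accelerations(positions, masses)
print(f"{len(masses)} particles, peak acceleration {np.abs(acc).max():.3g}")
```

The coarse particles are few and heavy, so they supply the large-scale gravitational field cheaply, while the small, light particles carry the resolution where the halo forms – the same trade-off the researchers make at vastly larger scale.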

Computing initial conditions, Stadel says, taxed the available computing power and memory and slowed communications between processors. “All of these problems were present to a lesser extent in earlier simulations, but were just below the level of being recognized as potentially serious problems for a simulation with 10 times the resolution.” The researchers have resolved the problems, however, and have run a series of lower-resolution simulations; they’re now checking whether the initial conditions evolve correctly from the early universe to the present, Stadel says.

Many of the INCITE researchers also collaborated with the Zurich group on GHALO, a galactic dark matter simulation that tracked 2.1 billion particles of 1,000 solar masses each. Other researchers on the project include Ben Moore, director of the Institute for Theoretical Physics, and Marcel Zemp of the University of Michigan.

Diemand, Moore, Stadel, Potter and other researchers on the team specialize in computational astrophysics and related areas. Still others focus on astrophysics, astronomy and cosmology. The group also has collaborated with researchers working in particle physics and other subjects.

It’s invigorating, Kuhlen says. “We find ourselves at an intersection of a lot of different fields,” with experiments in gamma-ray and particle detection and bigger and better computer simulations reaching fruition. “It’s a really interesting time because there are advances in physics and computers all coming together.”


Thomas R. O'Donnell

The author is a former Krell Institute science writer.

