
Seeing the invisible

This frame from the Via Lactea II visualization shows the dark matter halo as it might look today, more than 13.7 billion years after the Big Bang. Gravity has drawn the particles into dense clumps, which retain much of their structure as they are pulled toward the halo’s center. The color scale shows dark matter density increasing from blue to white.

The model indicates that hundreds of concentrated dark matter clumps survive even near our solar system. Earlier simulations with coarser resolution – larger representative particles – portrayed dark matter in this area as smooth.

Via Lactea and other simulations are helping make the case for the cold dark matter model, Kuhlen says. If dark matter were hotter, its thermal velocity would produce smooth structure rather than clumps. Beyond that, Via Lactea says little about just what kind of particles make up dark matter, because it’s designed to resolve only dark matter’s gravitational effects.

“We’re concerned with how dark matter is laid out on a larger scale than one in which individual particle physics would matter. At the same time, it doesn’t mean (the simulation is) irrelevant for experiments that attempt to directly detect dark matter particles.”

In fact, the group’s findings could help researchers understand experiments that attempt to directly detect rare collisions between dark matter particles and heavy atomic nuclei, like iodine, xenon or tungsten.

Via Lactea II produced a huge amount of data: about 20 terabytes (20 trillion bytes) – enough to fill about 100 average-sized personal computer hard drives. That’s tiny, however, compared to the 500 terabytes expected to come out of the next version, tentatively named Silver River, the East Asian nickname for the Milky Way.
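To put those figures in perspective, here is a quick back-of-the-envelope check in Python. The roughly 200-gigabyte drive size is an assumption about typical consumer hardware of the era, not a number from the article:

```python
# Rough check of the storage figures quoted above.
TB = 1e12  # bytes; the article counts decimal terabytes ("20 trillion bytes")

via_lactea_ii = 20 * TB   # data produced by Via Lactea II
drive = 200e9             # assumed ~200 GB consumer hard drive of the era

print(f"~{via_lactea_ii / drive:.0f} drives to hold Via Lactea II")  # ~100
print(f"Silver River: {500 * TB / via_lactea_ii:.0f}x more data")    # 25x
```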

Instead of representative dark matter particles of about 4,000 solar masses each, researchers plan to track particles of just 100 solar masses, Kuhlen says. “It’s like skipping a generation. It’s also why it’s causing us trouble. It’s a very big jump.”
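The scale of that jump is easy to estimate. Here is a minimal sketch, assuming a Milky-Way-scale halo of roughly a trillion solar masses; the article gives only the per-particle masses, so the halo mass is an assumption:

```python
# How many particles it takes to resolve a halo at a given per-particle
# mass. The halo mass is an assumed round number; the article specifies
# only the 4,000 and 100 solar-mass resolutions.

HALO_MASS = 1.0e12  # solar masses, assumed Milky-Way-scale halo

for name, particle_mass in [("Via Lactea II", 4000.0),
                            ("Silver River (planned)", 100.0)]:
    n = HALO_MASS / particle_mass
    print(f"{name}: ~{n:.1e} particles of {particle_mass:g} solar masses")
```

Going from 4,000 to 100 solar masses is a 40-fold gain in mass resolution, so the same halo needs roughly 40 times as many particles – about 10 billion at the finer resolution, consistent with the count quoted below.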

The researchers have struggled with establishing the simulation’s initial conditions – the particle positions and velocities that define its starting point. The obstacle kept the group from using the bulk of a 2009 INCITE grant of 5 million processor-hours on Jaguar. The grant has been extended through 2010.

Led by Jürg Diemand, Joachim Stadel and Potter, the team is continuing to develop PKDGRAV2 and tackle the initial conditions problem. “The sheer size and high resolution of the initial conditions presented us with several new problems,” Stadel says.

It’s tricky, Stadel says, because the researchers must zoom in on just a tiny region of the entire universe – an area producing a dark matter halo the size of the Milky Way. The simulation must account for gravitational fields from the rest of the universe while focusing most of its computing power on the halo itself.

The net result is a simulation that starts with about 10 billion dark-matter-halo particles, adds another 20 billion in the area immediately outside the halo and then throws in about 15 billion more massive, lower-resolution particles. Tracking such a huge number of interacting particles over billions of years is an enormous computational challenge.
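One way to picture that layout is as nested zones of progressively heavier particles, so that computing effort concentrates where the resolution is finest. In this sketch the three particle counts and the halo’s 100-solar-mass resolution come from the text; the two coarser particle masses are illustrative assumptions:

```python
# Illustrative zoom-in layout: light particles where detail matters,
# heavier ones farther out to supply the large-scale gravitational field.
# Particle counts are from the article; the two coarser masses are assumed.

zones = [
    # (region,                   particles, particle mass [solar masses])
    ("halo (high resolution)",   10e9,      100.0),
    ("just outside the halo",    20e9,      1.0e3),  # assumed mass
    ("distant, low resolution",  15e9,      1.0e6),  # assumed mass
]

for region, count, mass in zones:
    print(f"{region:26s} {count:.0e} particles of {mass:g} solar masses")

total = sum(count for _, count, _ in zones)
print(f"total tracked: {total:.1e} particles")  # 4.5e+10, about 45 billion
```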

Computing initial conditions, Stadel says, taxed the available computing power and memory and slowed communications between processors. “All of these problems were present to a lesser extent in earlier simulations, but were just below the level of being recognized as potentially serious problems for a simulation with 10 times the resolution.” The researchers have resolved the problems, however, and have run a series of lower-resolution simulations. They’re checking whether those runs evolve correctly from the early universe to the present, Stadel says.

Many of the INCITE researchers also collaborated with the Zurich group on GHALO, a galactic dark matter simulation that tracked 2.1 billion particles of 1,000 solar masses each. Other researchers on the project include Ben Moore, director of the Institute for Theoretical Physics at the University of Zurich, and Marcel Zemp of the University of Michigan.

Diemand, Moore, Stadel, Potter and other researchers on the team specialize in computational astrophysics and related areas. Still others focus on astrophysics, astronomy and cosmology. The group also has collaborated with researchers working in particle physics and other subjects.

It’s invigorating, Kuhlen says. “We find ourselves at an intersection of a lot of different fields,” with experiments in gamma-ray and particle detection and bigger and better computer simulations reaching fruition. “It’s a really interesting time because there are advances in physics and computers all coming together.”


Thomas R. O'Donnell

The author is a former Krell Institute science writer.
