NEW LIGHT ON DARK MATTER

They have all but pulled the plug on the ailing standard cold dark matter model.

Testing Cosmic Theory

Not many years ago, cosmology was more theology than hard science, says Princeton astrophysicist Jeremiah P. Ostriker. "Supercomputers have helped change that," he adds. Indeed, supercomputing is proving itself one of the most useful means of getting a handle on the nature and structure of the universe.

Ostriker, along with scientists Guohong Xu and Renyue Cen, is using the Pittsburgh Supercomputing Center's CRAY T3D to test competing theoretical models of cosmological origin, each of which offers a blueprint for the embryonic universe. One of these, the so-called standard cold dark matter model, had until recently been among the most widely held theories. While calculations for the other three models are still being analyzed, the Ostriker team's work has all but pulled the plug on the ailing standard cold dark matter model.

This project is one of the largest supercomputing efforts ever undertaken and is conducted under the auspices of the Grand Challenge Cosmology Consortium (GC3), a collaboration among physicists and computer scientists at six universities and research centers. Begun in 1993 with funding from NSF and led by Ostriker, GC3 aims to exploit scalable parallel systems like the CRAY T3D to investigate the universe.

PSC scientific visualization specialist Joel Welling led a GC3 collaborative effort that tapped the resources of three NSF centers to simulate the Andromeda and Milky Way galaxies colliding, an event predicted to occur some five billion years from now. Presented in a virtual reality medium, the model thrusts observers into the thick of the intergalactic squall. The project was recognized as the Best Integration of Heterogeneous Applications at Supercomputing '95, the annual conference that showcases high-performance computing and communication.

"What we've done that's new," says Ostriker, "is take some of these speculative scenarios, put them in a computer, and see what they predict in terms of gravitational lensing. Nobody had ever done that before because it's an incredibly hard computational problem."

The Original Space-Based Telescope

Gravitational lenses are caused by massive, compact clusters of galactic matter. Much as an optical lens bends light to form an image, the enormous gravitational field of a cluster magnifies and distorts light coming from objects lying far beyond it, such as quasars and galaxies, sometimes producing multiple images that flank the lens itself. It's a valuable, naturally occurring observational tool whose imaging potential also plays a key role in the Princeton simulations.
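
To make the lensing geometry concrete, here is a minimal sketch, not part of the Princeton group's code, that estimates the Einstein radius of a compact lensing cluster using the standard point-mass formula. The cluster mass and the distances are illustrative assumptions, and the lens-to-source distance is taken as a simple flat-space difference.

```python
# Minimal sketch: Einstein radius of a compact lens,
# theta_E = sqrt( (4 G M / c^2) * D_ls / (D_l * D_s) ).
# The mass and distances below are illustrative assumptions only.
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
MPC = 3.086e22         # one megaparsec in meters
M_SUN = 1.989e30       # one solar mass in kg

M = 1.0e14 * M_SUN     # assumed mass of the lensing cluster
D_l = 1000 * MPC       # observer-to-lens distance (assumed)
D_s = 2000 * MPC       # observer-to-source distance (assumed)
D_ls = D_s - D_l       # lens-to-source distance (flat-space shortcut)

theta_E = math.sqrt((4 * G * M / c**2) * D_ls / (D_l * D_s))   # radians
arcsec = math.degrees(theta_E) * 3600
print(f"Einstein radius ~ {arcsec:.1f} arcseconds")            # of order tens of arcseconds
```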

That lenses exist at all is due in part to dark matter, the invisible glue of the universe. Maintaining the structure of the universe, holding galaxies together as galaxies, for instance, requires much more gravity than observable matter can generate. Calculations indicate, in fact, that dark matter, the unobservable part of the universe, accounts for as much as 90 percent of all matter. Whether dark matter consists of as yet undetected galactic particles, massive but hidden stellar objects, or some combination of the two remains undetermined.
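
The classic version of that argument can be illustrated with a rough rotation-curve calculation: the mass needed to keep stars in orbit at a given speed and radius far exceeds the mass that shines. The orbital speed, radius and visible mass below are illustrative round numbers, not figures from the article.

```python
# Rough sketch of the dark-matter argument from a flat rotation curve:
# for circular orbits, v^2 = G * M(<r) / r, so M(<r) = v^2 * r / G.
# All numbers below are illustrative assumptions.
G = 6.674e-11                 # m^3 kg^-1 s^-2
M_SUN = 1.989e30              # kg
KPC = 3.086e19                # one kiloparsec in meters

v = 220e3                     # orbital speed far from a galaxy's center, m/s
r = 50 * KPC                  # radius at which the rotation curve is still flat

M_dynamical = v**2 * r / G    # mass implied by the observed motion
M_visible = 1.0e11 * M_SUN    # rough luminous mass of a large spiral (assumed)

print(f"dynamical mass ~ {M_dynamical / M_SUN:.1e} solar masses")
print(f"visible mass   ~ {M_visible / M_SUN:.1e} solar masses")
print(f"ratio          ~ {M_dynamical / M_visible:.0f}x")
```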

But, says Xu, "It doesn't matter what dark matter is made of. The lensing phenomenon detects the gravitational effect of the dark matter. And that's what counts when you're modeling a piece of the universe."

Nailing the Coffin of Cold Dark Matter

A model spreads matter around much the way it is currently distributed in the cosmos: thick, lumpy patches of galaxies and galaxy clusters separated by vast, comparatively empty stretches of space. Density distribution is key, and the number of gravitational lenses occurring in a particular patch of space becomes the test of that distribution. An accurate model will exhibit a distribution of lenses that coincides with observed data. "Different theories predict more or fewer gravitational lenses," says Ostriker, "and that's what we're computing for."
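
A minimal sketch of one way such a count could be made, not the Princeton group's actual method, is to project a simulated mass distribution along the line of sight and count the regions dense enough to produce multiple images. The critical surface density formula is standard lensing theory; the distances and the synthetic map below are illustrative assumptions.

```python
# Sketch of the statistical test described above: count how many regions of a
# projected mass map exceed the critical surface density for strong lensing,
# then compare that count with the observed abundance of lenses.
import numpy as np
from scipy import ndimage

G, c, MPC = 6.674e-11, 2.998e8, 3.086e22

# Critical surface density, Sigma_crit = c^2 D_s / (4 pi G D_l D_ls);
# surface densities above this value can produce multiple images.
D_l, D_s = 1000 * MPC, 2000 * MPC
D_ls = D_s - D_l                                   # flat-space shortcut
sigma_crit = c**2 * D_s / (4 * np.pi * G * D_l * D_ls)

# Stand-in for the projected surface density of one simulated patch of sky.
rng = np.random.default_rng(0)
surface_density = sigma_crit * rng.lognormal(mean=-3.0, sigma=1.5, size=(256, 256))

kappa = surface_density / sigma_crit               # lensing convergence
labels, n_lenses = ndimage.label(kappa > 1.0)      # connected super-critical regions
print(f"predicted strong-lens candidates in this patch: {n_lenses}")
```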

The standard cold dark matter theory, for instance, which posits just enough mass to prevent either eventual collapse or eternal expansion of the universe, produced many more lensed objects, and at greater intensities, than are actually observed. Another model, which holds that the universe will expand forever because not enough matter exists to prevent it, produced results more in line with today's universe. "The standard cold dark matter model is wrong," says Ostriker. "So the question is, are any of its variants plausible? For now, supercomputers such as the T3D are the primary tools that can help answer such questions."
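
The "just enough mass" in that model corresponds to the critical density of the universe, which follows from the Hubble constant alone. The back-of-envelope calculation below uses an illustrative round value for the Hubble constant, not a figure taken from the article.

```python
# The borderline between eventual collapse and eternal expansion is the
# critical density, rho_crit = 3 H0^2 / (8 pi G).  The Hubble constant here
# is an illustrative assumption.
import math

G = 6.674e-11                      # m^3 kg^-1 s^-2
MPC = 3.086e22                     # meters per megaparsec
H0 = 70e3 / MPC                    # 70 km/s/Mpc expressed in 1/s

rho_crit = 3 * H0**2 / (8 * math.pi * G)
print(f"critical density ~ {rho_crit:.1e} kg/m^3")   # roughly 9e-27 kg/m^3
```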

These two images represent a 3D universe simulated from standard cold dark matter parameters. Density varies from blue (low) through green, red and yellow (high). To gauge the probability of gravitational lensing, the model generates magnification maps (top). The denser the region, the greater the magnifying power. Vibrant yellow indicates the locations where otherwise undetectable objects, such as quasars and galaxies, are most likely to be significantly magnified and imaged multiple times. The second image shows gravitational lensing at work. The large yellow clump with red and green borders (top right) depicts the lens. The two small elongated objects closely flanking it represent lensed images. This model predicted many more gravitational lenses than are actually observed in a given unit of space, in effect ruling out the standard cold dark matter model.
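
One common way a magnification map like the one described above can be derived from a projected density field, offered here only as a hedged sketch rather than the group's own recipe, is the thin-lens relation between convergence, shear and magnification. The toy convergence field below is an assumption, and shear is neglected for simplicity.

```python
# Sketch: magnification from a convergence field.  For a thin lens with
# convergence kappa and shear gamma, mu = 1 / ((1 - kappa)^2 - gamma^2).
# The random convergence map is a stand-in, and shear is set to zero.
import numpy as np

rng = np.random.default_rng(1)
kappa = rng.lognormal(mean=-3.0, sigma=1.2, size=(256, 256))   # toy convergence map
gamma = 0.0                                                     # shear neglected here

denom = (1.0 - kappa) ** 2 - gamma ** 2
denom = np.where(np.abs(denom) < 1e-9, 1e-9, denom)             # guard against division by ~0
mu = 1.0 / denom                                                # magnification map

# Highly magnified pixels mark where lensed images of background objects would appear.
print("pixels with |mu| > 10:", int(np.sum(np.abs(mu) > 10)))
```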

The Universe and the T3D

Because of limited computing capability, a prior modeling effort by the Ostriker group melded low-resolution universe models with high-resolution models, producing a portion of the universe that was quite small in cosmological terms (five megaparsecs). Though it offered a good approximation of the interactions between objects near one another, it didn't deal well with remote but interacting galaxies, clusters and other stellar bodies. Such interactions play a key role in the physics of the evolving universe.

The T3D eliminated the need to meld data, boosting the integrity of the results. It generated models 20 times larger (100 megaparsecs) without sacrificing the resolution necessary for small-scale accuracy, and accounted for interactions between remote but gravitationally related objects. "We still get the same resolution achieved with the much smaller model," says Xu. The T3D's memory and data capacities, adds Xu, played a critical role in the project's success. Each of the four simulations generated about 50 gigabytes of data, averaged 40,000 processing hours, and ran mostly using 256 processors, with minimum and maximum runs using 128 and 512. "We ran the biggest simulation of this type ever attempted," says Ostriker. "It could not have been done without the T3D."
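
A quick back-of-envelope view of those run sizes is sketched below, assuming the quoted 40,000 "processing hours" refer to aggregate processor-hours, which is an assumption rather than something the article states.

```python
# Back-of-envelope conversion of the run sizes quoted above, assuming the
# 40,000 "processing hours" are aggregate processor-hours (an assumption).
pe_hours = 40_000          # processor-hours per simulation (from the article)
processors = 256           # typical partition size used (from the article)
data_gb = 50               # output volume per simulation, gigabytes (from the article)
n_simulations = 4          # number of simulations (from the article)

wall_clock_hours = pe_hours / processors
print(f"wall-clock time per run: ~{wall_clock_hours:.0f} hours (~{wall_clock_hours / 24:.1f} days)")
print(f"total output: ~{data_gb * n_simulations} gigabytes")
```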



Researchers: Jeremiah P. Ostriker, Princeton University.
Hardware: CRAY T3D
Software: User-developed code.
Keywords: astrophysics, cosmology, cosmological origin, standard cold dark matter model, Grand Challenge Cosmology Consortium, GC3, gravitational lensing, galactic matter, megaparsecs.

Related Material on the Web:
GC3 Home Page
Mysterious Dark Matter from NCSA's Cosmos in a Computer online exhibit.
Joel Welling's Home Page
Projects in Scientific Computing. PSC's annual research report.

References, Acknowledgements & Credits