Ostriker, along with scientists Guohong Xu and Renyue Cen, is using Pittsburgh Supercomputing Center's CRAY T3D to test competing theoretical models of cosmological origin, each of which offers a blueprint for the embryonic universe. One of these, the so-called standard cold dark matter model, had until recently been one of the most widely held theories. While calculations for the other three theories are still being analyzed, the Ostriker team's work has all but pulled the plug on the ailing standard cold dark matter model.
|PSC scientific visualization specialist Joel Welling led a GC3 collaborative effort that tapped the resources of three NSF centers to simulate the Andromeda and Milky Way galaxies colliding, an event predicted to occur some five billion years from now. Presented in a virtual reality medium, the model thrusts observers into the thick of the intergalactic squall. This project was recognized as the Best Integration of Heterogeneous Applications at Supercomputing '95, the annual conference that showcases high-performance computing and communication.|
"What we've done that's new," says Ostriker, "is take some of these speculative scenarios, put them in a computer, and see what they predict in terms of gravitational lensing. Nobody had ever done that before because it's an incredibly hard computational problem."
That lenses exist at all is due in part to dark matter, the invisible glue of the universe. Maintaining the structure of the universe, holding the galaxies together as galaxies, for instance, requires much more gravity than observable matter can generate. Calculations indicate, in fact, that dark matter, the unobservable part of the universe, accounts for as much as 90 percent of all matter. Whether dark matter consists of as-yet-undetected particles, massive but hidden stellar objects, or some combination of the two remains undetermined.
But, says Xu, "It doesn't matter what dark matter is made of. The lensing phenomenon detects the gravitational effect of the dark matter. And that's what counts when you're modeling a piece of the universe."
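Xu's point, that lensing responds to mass alone, can be illustrated with the standard textbook formula for a point-mass lens, whose Einstein radius depends only on the lens mass and the geometry of lens and source. This short sketch is illustrative only; the masses and distances are hypothetical examples, not values from the Ostriker team's simulations, which model far more complex mass distributions.

```python
import math

# Physical constants (SI units)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
MPC = 3.086e22       # one megaparsec, m

def einstein_radius_arcsec(mass_kg, d_lens_m, d_source_m):
    """Angular Einstein radius of a point-mass lens, in arcseconds.

    Uses the simple flat-geometry approximation d_ls = d_s - d_l,
    which ignores cosmological distance subtleties.
    """
    d_ls = d_source_m - d_lens_m
    theta = math.sqrt(4 * G * mass_kg / C**2 * d_ls / (d_lens_m * d_source_m))
    return math.degrees(theta) * 3600

# Hypothetical example: a 10^12 solar-mass galaxy halfway to a
# source 2000 Mpc away lenses images about 2 arcseconds apart.
print(einstein_radius_arcsec(1e12 * M_SUN, 1000 * MPC, 2000 * MPC))
```

Note that the mass term enters as a whole: the formula is indifferent to whether the mass is luminous stars, exotic particles, or dark stellar remnants, which is why lensing surveys can weigh dark matter without identifying it.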
The standard cold dark matter theory, for instance, which assumes just enough mass to prevent eventual collapse or eternal expansion of the universe, produced many more lensed objects of greater intensity than are actually observed. Another model, which says the universe will expand forever because not enough matter exists to prevent it, produced results more in line with today's universe. "The standard cold dark matter model is wrong," says Ostriker. "So the question is, are any of its variants plausible? For now, supercomputers such as the T3D are the primary tools that can help answer such questions."
|These two images represent a 3D universe simulated from standard cold dark matter parameters. Density varies from blue (low) through green, red and yellow (high). To gauge the probability of gravitational lensing, the model generates magnification maps (top). The denser the region, the greater the magnification power. Vibrant yellow indicates the most likely locations that otherwise undetectable objects, such as quasars and galaxies, would be significantly magnified and imaged multiple times. The second image shows gravitational lensing at work. The large yellow clump with red and green borders (top right) depicts the lens. The two small elongated objects closely flanking it represent lensed images. This model predicted many more gravitational lenses than are actually observed in a given unit of space, in effect ruling out the standard cold dark matter model.|
The T3D eliminated the need to meld data from separate runs, boosting the integrity of the results. It generated models 20 times larger (100 megaparsecs) than earlier efforts without sacrificing the resolution necessary for small-scale accuracy, and it accounted for interactions between remote but gravitationally related objects. "We still get the same resolution achieved with the much smaller model," says Xu. The T3D's memory and data capacities, adds Xu, played a critical role in the project's success. Each of the four simulations generated about 50 gigabytes of data and averaged 40,000 processing hours, running mostly on 256 processors, with runs ranging from 128 to 512. "We ran the biggest simulation of this type ever attempted," says Ostriker. "It could not have been done without the T3D."
References, Acknowledgements & Credits