With MassiveBlack, the largest cosmological simulation of its kind to date, and a new approach to visualizing the results enabled by PSC’s Blacklight, astrophysicists solved a puzzle about how some of the first black holes in the universe became supermassive in so short a time.


[from left to right] Tiziana Di Matteo, Rupert Croft, Yu Feng & Nishikanta Khandai, Carnegie Mellon University

“How did we get these huge monsters so early on?” asks Tiziana Di Matteo. The Carnegie Mellon astrophysicist is referring to supermassive black holes, which — astrophysicists now know — reside at the center of every large galaxy. These cosmic behemoths, with masses that can be many billions of times that of the sun, swallow huge quantities of gas and form the gravitational cores that have structured matter into galaxies. Although themselves invisible, they signal their presence by the quasars they spawn, as inward-drawn gas heats and radiates light that can be a hundred trillion times brighter than the sun.

Di Matteo’s question responds to recent astronomical observations, such as those of the Sloan Digital Sky Survey, that have discovered quasars associated with supermassive black holes in the first billion years after the big bang. The existence of black holes per se isn’t surprising, but supermassive ones at these infant stages of a universe now 13.6 billion years old present a challenge for the reigning “cold dark matter” model of how the universe evolved. “If you write the equations for how galaxies and black holes form,” says Rupert Croft, Di Matteo’s Carnegie Mellon colleague, “it doesn’t seem possible that these huge masses could form that early.”

To resolve this puzzle, Di Matteo, Croft and their collaborators turned to supercomputing. “Even before the recent quasar observations,” says Di Matteo, “it’s been a pressing question. When and how did the first black holes form?” To get some answers, in 2010 Di Matteo, Croft and the rest of their group mounted a very large-scale simulation on Kraken, a Cray XT5 XSEDE resource at the University of Tennessee (NICS). They called their simulation MassiveBlack.

Using all of Kraken’s nearly 100,000 compute cores, with postdoctoral fellow Nishikanta Khandai doing most of the hands-on computing, the researchers simulated a huge volume of space: a cube 0.75 gigaparsecs (about 2.5 billion light years) on a side. Within this volume, MassiveBlack included 65.5 billion particles to represent matter as the universe evolved from 10 million years after the big bang, through the period of early structure formation and the emergence of the first galaxies and quasars, to when it was 1.3 billion years old.
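
To put those numbers in familiar units, the conversions are straightforward. The sketch below is a Python back-of-envelope check using astropy’s built-in WMAP9 cosmology, which is an assumption for illustration; the run’s exact cosmological parameters may differ slightly.

    # Back-of-envelope check of the quoted box size and time span,
    # assuming astropy's built-in WMAP9 parameters (illustrative only).
    from astropy import units as u
    from astropy.cosmology import WMAP9, z_at_value

    box_side = (0.75 * u.Gpc).to(u.lyr)
    print(box_side)                            # ~2.45e9 light years, i.e. about 2.5 billion

    # Age of the universe at redshift 4.75, the final epoch shown in the
    # GigaPan images below.
    print(WMAP9.age(4.75))                     # ~1.27 Gyr, the "1.3 billion years" above

    # Redshift corresponding to the ~10-million-year starting point.
    print(z_at_value(WMAP9.age, 10 * u.Myr))   # a redshift of order 100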

The largest cosmological simulation of its kind (smoothed particle hydrodynamics) to date, MassiveBlack produced a massive amount of data, for which the researchers turned to another XSEDE resource, Blacklight at PSC, the world’s largest shared-memory computing system. Blacklight made it possible to hold a snapshot of the entire simulation volume, three terabytes of data, in memory at one time.
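
That three-terabyte figure is consistent with the particle count. The short calculation below is a back-of-envelope illustration; the per-particle breakdown in the comment is an assumption, not the run’s actual on-disk format.

    # Rough consistency check: bytes per particle implied by a
    # three-terabyte snapshot holding 65.5 billion particles.
    n_particles = 65.5e9
    snapshot_bytes = 3e12                    # 3 TB, as quoted above
    print(snapshot_bytes / n_particles)      # ~46 bytes per particle
    # Roughly what a GADGET-style snapshot needs per particle for
    # single-precision positions and velocities (6 x 4 bytes) plus an ID,
    # a mass, and a few gas quantities such as density and internal
    # energy (an illustrative breakdown).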

With help from this new approach to visualization, the researchers identified a physical phenomenon that goes far toward explaining the existence of supermassive black holes so early in the universe.


GIGAPAN VIEW OF MASSIVE BLACK: These images represent a screen grab from the GigaPan viewer. The background (above) shows the entire simulation volume at redshift 4.75, about 1.3 billion years after the big bang. Respective zooms lead to a region that contains one of the most massive black holes, shown in the final zoom. Color indicates density (increasing from red to yellow to blue to green).

“We were able to show,” says Di Matteo, “that in regions of high density the gas comes straight into the center of the black hole, extremely fast, and in these places we see that the black holes grow really, really quickly.” The researchers call this phenomenon “cold gas flows.” It had been seen in other simulations and had been gaining acceptance as a phenomenon involved in galaxy formation, but only at much lower redshifts, when the universe is older, not within the first billion years. “This was the first simulation,” adds Di Matteo, “to see this at high redshift.”

Evolution of a Simulation

To run MassiveBlack, Di Matteo, Croft and colleagues used a cosmological simulation code (called P-GADGET) with proven ability to track the physics of black holes and quasars along with galaxy formation as the universe evolves. Five years earlier, Di Matteo had led a large run with GADGET on PSC’s then-largest system, BigBen; it was the first simulation to include the physics of black holes and to run at sufficiently fine resolution to track their formation.

Using 2,000 of BigBen’s Cray XT3 processors in parallel over four weeks of run time, that simulation tracked a volume of the cosmos (a cube 33 megaparsecs on a side) that was large for the time but more than 1,000 times smaller than MassiveBlack’s. The findings — on the relationships between black holes and galaxy structure and the feedback process by which black holes eventually shut off their associated quasars — offered many new insights. “We’re still publishing papers from that simulation,” says Di Matteo. “It was very rich in science.”
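
The size comparison follows directly from the two cube sides; the short calculation below, with side lengths in megaparsecs, simply makes the ratio explicit.

    # Volume ratio of the MassiveBlack cube (0.75 Gpc = 750 Mpc per side)
    # to the earlier BigBen cube (33 Mpc per side).
    ratio = (750 / 33) ** 3
    print(f"{ratio:,.0f}")    # ~11,700 -- comfortably "more than 1,000 times"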

MassiveBlack, however, was, metaphorically speaking, another universe entirely: a gigantic run requiring much more extensive computing. The starting requirement was a much larger volume of space, since supermassive black holes are very rare objects in the early universe. “If you looked at that volume in the present day,” says Croft, “there would be about a million Milky Way-size galaxies. But in this early epoch, there would be only a handful of quasars with black holes of a billion solar masses.”

The availability of the powerful Kraken system was crucial, and Di Matteo, Croft and colleagues worked extensively for more than a year to include more physics with GADGET and to optimize it to run efficiently (to “scale”) on Kraken’s much larger processor count. PSC and XSEDE staff, including XSEDE extended collaborative support consultant Anirban Jana, helped with testing and benchmarking on various systems. The result with MassiveBlack, say the researchers, was a simulation high enough in resolution to follow how mass is distributed in the inner regions of galaxies, including how stars form and black holes grow as this huge volume of space evolves.

“It provides a unique framework to study the formation of the first quasars,” the researchers wrote in their paper (Astrophysical Journal Letters, submitted). They saw dense matter forming web-like filaments of structure, a phenomenon seen in the earlier PSC simulation on BigBen. But what the researchers also saw happening in the early universe with MassiveBlack — unavailable to discovery without large-scale simulation — was that these filaments essentially become pipelines of highly dense gas shooting directly to the center of black holes. “Very dense blobs of gas are going straight in,” says Di Matteo. “Glug.”


FAST FOOD FOR THE MONSTER: Three snapshots from MassiveBlack at three different redshifts (higher redshift represents earlier time) show evolution of a quasar associated with a supermassive black hole within the first billion years of the universe. Gas distribution is color coded by temperature (blue through red). Cold streams of gas (green) penetrate the dark matter “halo” (blue circle) of the black hole (green circle) at the galaxy center.

Seeing is Believing

With simulations such as MassiveBlack and others that produce enormous quantities of data, the ability to see simulation data in visual form is a big part of discovery. For MassiveBlack, graduate student Yu Feng, as part of Di Matteo and Croft’s team, created innovative visualization tools for which the shared memory of Blacklight, PSC’s newest system, was essential.

From the total of 36 snapshots of the simulation volume’s evolution, the researchers identified 10 that captured the formation of the first quasars and black holes, with each snapshot comprising between three and four terabytes of data. Using PSC’s Lustre WAN storage system, the researchers transferred this data from Tennessee to Pittsburgh, where they could work with it on Blacklight. Under other circumstances it would take hours to read three terabytes of data from a hard drive, but with Blacklight’s ability to hold an entire snapshot in memory at once, Feng was able to “raster” each of the snapshots from simulation particles into pixels, so that the results were easily viewable.
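
The rastering step can be pictured with a minimal sketch: project particle positions onto a two-dimensional grid, accumulate mass per pixel, and compress the enormous dynamic range with a logarithm. The Python/NumPy sketch below is illustrative, with made-up array names and sizes; it is not Feng’s actual tool, which operated on the full 65-billion-particle snapshots held in Blacklight’s shared memory.

    import numpy as np

    def raster(x, y, mass, box_size, npix=4096):
        """Project particles onto an npix x npix grid of mass per pixel."""
        image, _, _ = np.histogram2d(
            x, y,
            bins=npix,
            range=[[0.0, box_size], [0.0, box_size]],
            weights=mass,
        )
        # Log scaling makes the huge range of cosmic densities visible;
        # the small offset keeps empty pixels finite.
        return np.log10(image + image[image > 0].min())

    # Toy data standing in for one snapshot's gas particles (comoving Mpc).
    rng = np.random.default_rng(0)
    x, y = rng.uniform(0.0, 750.0, size=(2, 1_000_000))
    mass = np.ones_like(x)
    img = raster(x, y, mass, box_size=750.0, npix=1024)

From a pixel image like this, properties such as density or temperature can be mapped to colors and served to an interactive viewer, which is what makes the panning and zooming Croft describes possible.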

“With an entire dataset in memory,” says Croft, “you can use colors to map properties, such as temperature and density, and you can click and zoom-in, move to a different area. Blacklight is the easiest machine to be able to do this. It allows the most transparent coding to manipulate these large datasets.”

To do this kind of viewing with this amount of data is revolutionary

Feng made one of the snapshots (65 billion particles, more than a trillion pixels of imaged data) available for interactive viewing and public access through the GigaPan web interface: http://www.gigapan.org. Developed by Carnegie Mellon in collaboration with the NASA Ames Intelligent Robotics Group, with support from Google, GigaPan provides interactive gigapixel viewing on the web. Also for GigaPan viewing, the researchers created an interactive, zoomable animation of a smaller simulation: 1,000 frames representing the complete time evolution of early quasar and black hole formation.

These visualization tools, says Di Matteo, were vital to their “cold gas flow” findings, and to do this kind of viewing with this amount of data, she believes, is revolutionary: “You can pan through the entire volume. It’s all there. You can look at details, and you can change your mind and look somewhere else and compare. Before you would have to run code to do that. Here we saw this cold stream going in, and our reaction was ‘Wow.’”
