Projects in Scientific Computing: At the Frontier of Physics and Chemistry

  A Taste of Quark Soup
Simulated heavy-ion collisions test the limits of data acquisition.

To recreate in a contained space conditions like those during the first microsecond after the birth of the universe — that's what physicists are up to on Long Island as the millennium winds down. The Relativistic Heavy Ion Collider, RHIC, a $600 million project at Brookhaven National Laboratory, under construction since 1991, is now finishing tests in preparation for the first actual physics runs, scheduled for January 2000.

The calendar gives a propitious gloss to the timing, and the potential for new understanding at the most fundamental levels of physics has scientists around the world anticipating the possibilities. "To actually be on the scene, to feel the excitement, see the signals coming in. It's hard to describe," says Mort Kaplan, professor of chemistry at Carnegie Mellon. "There's a tingling in the air. We're going to see phenomena we've never seen before."

Kaplan is a founding member of STAR, an international collaboration of 37 institutions and more than 400 researchers. STAR (Solenoidal Tracker at RHIC), one of two large detectors in the new collider, will track the profusion of particles liberated from nuclei when heavy ions smash into each other at near light speed. STAR scientists will gather the data and put their heads together to figure out what it means.

Beginning in summer 1998, Kaplan and his colleague Dan Russ collaborated with PSC scientists to conduct two rounds of STAR "mock data challenges," a series of simulated collider events intended to help the researchers and their processing systems get ready for the real thing. The MDCs relied on PSC's CRAY T3E and also used a T3E at NERSC (National Energy Research Scientific Computing Center) in California. PSC staff worked diligently to revise the simulation software, called GSTAR, to run on the T3E. "This was a huge task," says PSC scientist Nick Nystrom. "It's a Monte Carlo simulation package, and each collision event is enormously complex."

Altogether, the two rounds of MDC produced nearly 300,000 simulated heavy-ion collisions, captured as 3.9 trillion bytes (3.9 terabytes) of data, which gives an idea of the magnitude of the computing challenge associated with STAR. Run-throughs of the data storage and analysis processes, says Kaplan, led to software changes that speed up the process and reduce the amount of memory required. "The availability of extensive PSC resources — not only the T3E, but also the massive data-storage capability and consulting staff close at hand — was crucial to the success of this very large-scale series of simulations."
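A back-of-envelope check of these figures shows what each simulated event weighs in at. The calculation below assumes "terabyte" means a decimal trillion bytes, as the text suggests:

```python
# Back-of-envelope check of the mock data challenge figures quoted above
# (300,000 events totaling 3.9 terabytes), assuming 1 TB = 10**12 bytes.
events = 300_000
total_bytes = 3.9e12
bytes_per_event = total_bytes / events
print(f"{bytes_per_event / 1e6:.0f} MB per simulated event")  # prints "13 MB per simulated event"
```

A dozen or so megabytes per event, times hundreds of thousands of events, is why the T3E and PSC's mass storage were needed at all.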

When Heavy Ions Collide

The "HI" in RHIC means heavy ion, and that's the key to this collider for the new millennium. Other particle physics facilities, such as CERN in Switzerland and Fermilab in Illinois, also investigate very high-energy states of matter, but for the most part they do it with beams of protons or light ions. CERN has a heavy-ion program (a lead beam smashing into a stationary target), but RHIC will be the first facility to smash heavy ions into each other. By comparison, prior collision experiments are relatively tame.

Many RHIC experiments will use gold, a heavy element with a massive nucleus of 79 protons and 118 neutrons. Strip away the electrons and you have a heavy ion. RHIC will accelerate these ions around two nearly circular 2.4-mile tracks in opposite directions to reach 99.99 percent of the speed of light. The collisions will create not only extremely high temperatures, like other accelerators, but also much higher nuclear densities in the overlap region, a significant distinction.

Because they're traveling at virtually the speed of light, the normally spherical nuclei will flatten like pancakes before they collide. The colliding nuclear particles pass through each other for an instant before they explode, and this ultra-high density state will be equivalent to turning up the heat to ten trillion degrees Kelvin, 10,000 times hotter than the center of the sun. The resulting blast of particles, to put it mildly, is messy. A single head-on gold-gold collision is likely to produce 5,000 to 10,000 individual subatomic particles shooting off in every possible direction.

Somewhere in that instantaneous chaos may exist, for a hundred trillionths of a trillionth of a second, a state of matter that mirrors the universe at the instant after the big bang. At these extreme conditions, the most fundamental particles — quarks — are expected to be released from the gluons that bind them into neutrons and protons. That's what the physicists are looking for — a state of matter never before observed called quark-gluon plasma, known less technically as quark soup.

If quark soup is there, it will be up to STAR to detect it. The $60 million STAR detector is a 1,200-ton instrument about the size of a house. At its heart is the Time Projection Chamber, essentially 140,000 wires that operate like a 3D digital camera. Flying particles induce current in the wires, allowing the camera to record the thousands of particles released at the instant of a heavy-ion collision. That particle data is what will let scientists sniff out quark soup.

Calibrating the STAR Trigger

  Images from a simulation of a gold-gold head-on collision. The STAR detector is shown in outline. The same event is also shown in close-up. Colors represent different types of charged particles, with about 20 types represented.

To detect quark soup presents a supreme challenge in data acquisition and processing as well as physics. "There will be up to 1,000 collisions per second," says Kaplan. "And the fastest you can write to magnetic tape is one collision per second." Each event yields about 20 megabytes of data, roughly the amount that can be written to tape in a second.
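The mismatch in these figures is the whole problem. A quick calculation, using only the rates quoted above, shows why nearly every event must be discarded on the fly:

```python
# The arithmetic behind the trigger problem, using the figures in the text:
# up to 1,000 collisions/s, ~20 MB per event, and tape that can absorb
# roughly one event (~20 MB) per second.
collision_rate = 1_000            # events per second, worst case
event_size_mb = 20                # megabytes per event
tape_rate_mb_s = 20               # megabytes per second to tape

produced_mb_s = collision_rate * event_size_mb      # 20,000 MB/s arriving
rejection_factor = produced_mb_s / tape_rate_mb_s   # events per recordable event
print(f"only 1 event in {rejection_factor:.0f} can be recorded")  # prints "only 1 event in 1000 can be recorded"
```

In other words, the detector produces data a thousand times faster than it can be stored, so 999 of every 1,000 events must be rejected in real time.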

"Most of what the detector will see isn't interesting," explains Kaplan. To sort through the overabundance of particle data and identify events that may yield new knowledge involves applying theory-based physics models of what will happen. "Fast decisions have to be made online, simple measurements that allow you to decide whether to start processing, or to eliminate an event and go on to the next one."

This process, controlled through a series of software pipelines, is called the STAR trigger. The question is how to decide in real time what data to collect and what to discard, and the mock data challenges were crucial, notes Kaplan, in setting up algorithms to do this. Kaplan cautions, however, that the real test is yet to come. No one knows with certainty what will happen when RHIC flings gold ions at each other for the first time.
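As an illustration only (the real STAR trigger algorithms are far more sophisticated and run in specialized hardware and software pipelines), a staged filter of this kind might look like the following sketch. The stage names, thresholds, and event fields here are all hypothetical:

```python
# Hypothetical sketch of a staged trigger pipeline, in the spirit of the
# trigger described above: a cheap first-pass cut, then a costlier check
# run only on survivors. None of these cuts are the real STAR algorithms.

def fast_multiplicity_cut(event, threshold=1000):
    # Cheap first decision: keep only high-multiplicity (central) events.
    return event["n_charged"] >= threshold

def slower_vertex_cut(event, max_offset_cm=30.0):
    # More expensive check, applied only to events that pass stage one.
    return abs(event["vertex_z_cm"]) <= max_offset_cm

def trigger(events):
    """Run events through the pipeline; return those worth recording."""
    kept = []
    for ev in events:
        if fast_multiplicity_cut(ev) and slower_vertex_cut(ev):
            kept.append(ev)
    return kept

sample = [
    {"n_charged": 5200, "vertex_z_cm": 4.1},   # central, well-placed: keep
    {"n_charged": 120,  "vertex_z_cm": 1.0},   # peripheral: rejected by stage one
    {"n_charged": 6400, "vertex_z_cm": 88.0},  # central but off-vertex: rejected
]
print(len(trigger(sample)))  # prints "1"
```

The design point is the ordering: the cheapest measurement runs first on every event, so the expensive checks see only a small fraction of the incoming stream.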

The theoretical models, emphasizes Kaplan, are based on long extrapolations from known results. "Once you start getting real data, we'll be able to see where we're wrong." For this reason, computational simulations will continue to be crucial to STAR's search for quark soup. A large part of the analysis will be to carry out new simulations to understand the data.

"When you look at the experimental data," says Kaplan, "you see tracks or numbers or characteristics of observables. From this you have to infer the physics. The models work the other way around, starting from the physics you do simulations to produce tracks and observables. Getting the physics out of the data is always partly interpretive, and you need the simulations to go hand in hand with data."

Overall, RHIC is expected to produce a million-billion bytes (a petabyte) of data a year for ten years. Part of the purpose of the MDCs was to pipe high volumes of simulated data through the computational and storage facilities at RHIC to test their capability to keep up with real-time data.
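Spread over a year, that volume implies a sustained throughput in the same ballpark as the tape-writing rate quoted earlier. A rough figure, assuming a decimal petabyte:

```python
# What a petabyte per year means as a sustained data rate
# (assuming 1 PB = 10**15 bytes and a 365-day year).
seconds_per_year = 365 * 24 * 3600        # 31,536,000 s
bytes_per_year = 1e15
sustained_mb_s = bytes_per_year / seconds_per_year / 1e6
print(f"~{sustained_mb_s:.0f} MB/s, around the clock, for ten years")
```

Roughly 30 megabytes every second, year-round for a decade, which is why the MDCs stress-tested the storage path as much as the analysis code.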

"The second MDC," says Russ, "was especially useful to test the algorithms that will analyze the data as it comes off, and it also exercised the high-performance storage facility at RHIC. We used 64 processors of the PSC T3E, and had excellent PSC staff support. The availability of PSC was crucial. This would have taken forever on workstations."

Researchers: Mort Kaplan, Carnegie Mellon University
Nick Nystrom, Pittsburgh Supercomputing Center
Sergiu Sanielevici, Pittsburgh Supercomputing Center
Raghu Reddy, Pittsburgh Supercomputing Center
Hardware: CRAY T3E
Software: GSTAR
Related Material on the Web:
STAR Home Page
The Relativistic Heavy Ion Collider at Brookhaven National Laboratory
Writing: Michael Schneider
HTML Layout/Coding: R. Sean Fulton

© Pittsburgh Supercomputing Center (PSC), Revised: Oct. 15, 1999