Jacobo Bielak (top left), David O'Hallaron (top right), Ricardo Taborda (bottom left), Leonardo Ramírez-Guzmán (bottom right).
The earth shook on the day Jacobo Bielak was born and — whether you think of it as serendipity or random chance — earthquakes became his career. “It was a small one,” says Bielak. In his home town of Mexico City, quakes are common due to the top layer of soft soil. “It’s like a bowl of jello. The waves amplify. You develop an inner ear that tells you, ‘Oh, this is a minor one,’ and then once in a while you say, ‘Oh-oh, this one’s for real.’ ”
The “for real” ones have occupied Bielak’s workdays for many years. Bielak, a professor of civil and environmental engineering at Carnegie Mellon University, and his colleague there, computer scientist David O’Hallaron, lead the Quake Group, one of the world’s leading efforts to develop computational strategies for realistically simulating the soil vibrations that occur during earthquakes. Early this year, Bielak won a four-year, $1.6-million grant from the National Science Foundation to support simulation work with colleagues from the University of California that will help predict seismic risks in the Los Angeles basin and other earthquake-prone urban areas.
Working in close collaboration with the Southern California Earthquake Center (SCEC), an interdisciplinary community of hundreds of scientists worldwide, the Quake Group pursues a guiding objective of “hazard analysis” — understanding the probability, region by region within an earthquake basin, that a certain level of ground motion will occur. “The severity of shaking,” says Bielak, “can vary significantly within relatively small areas — depending on geological characteristics. Our models can predict the ground motion of a prescribed quake, and engineers need this information to define building codes that provide for the safest possible structures at reasonable cost.”
In 2006, the Quake team — in a project led by former grad student Tiankai Tu — won the HPC Analytics Challenge Award at SC06 in Tampa for “Hercules,” their software that coordinates all the stages of large-scale earthquake simulation, from problem definition to final visualization. Using PSC-developed software called PDIO, Hercules can visualize results in real time as the simulation is running. With this unified framework, all tasks — building a software mesh that subdivides the quake volume, partitioning the job among hundreds or thousands of processors, the simulation itself, and visualizing results — are performed in place on the computing platform.
In recent Quake Group work, Ph.D. students Leonardo Ramírez-Guzmán and Ricardo Taborda (with consulting from PSC scientist John Urbanic) ran Hercules on BigBen, PSC’s Cray XT3, to simulate a major Southern California earthquake scenario called ShakeOut. Their simulation of this potential magnitude 7.8 quake shows how the largest ground motions occur within sedimentary valleys, where waves are trapped within the basin and amplified by the soft soil deposits.
Hercules simulations of ShakeOut, furthermore, have been part of an important cross-validation study with two other SCEC simulation groups. After three years, the results of this work — which rely partly on data-analysis routines developed by PSC scientist Joel Welling — show that the three schemes are consistent and accurate enough to rely upon for future work. “We’ve come a long way from where we started,” says Bielak, “and this is an important point from which we can go forward with confidence in the validity of our simulations.”
The red box shows the area of the ShakeOut quake scenario, with the dashed line representing the portion of the San Andreas fault where slip occurs. A snapshot from the simulation (top right) shows distribution of the horizontal ground velocity through the epicentral region 60 seconds after onset of the quake. The final graphic (bottom right) shows the distribution of maximum ground-velocity from the ShakeOut quake, with an inset showing the Los Angeles region. The largest motion (yellow to white) occurs within sedimentary valleys where waves are trapped and amplified.
The ShakeOut quake scenario — a project that involves more than 300 professionals — was defined by the U.S. Geological Survey and SCEC as a magnitude 7.8 quake set off by a rupture along more than 250 kilometers of the San Andreas fault. Scientists believe that such a quake — large enough to cause strong shaking over much of southern California — is inevitable. Estimates are that it will cause 2,000 deaths, 50,000 injuries, $200 billion in damage and other losses, and long-lasting disruption.
Understanding the potential of these impacts is an important step in preparing for the event. In November 2008, the ShakeOut scenario will be the centerpiece of emergency-response and public-preparedness exercises in Southern California involving three million people.
The ShakeOut volume domain — 600 kilometers long by 300 kilometers wide by 84 kilometers deep — encompasses the most prominent fault structures in the region, and includes all major cities in the Los Angeles basin. Simulations provide details of the ground motion, including variations within cities and regions, that help to identify a range of effects from direct physical impacts to long-term consequences, and assist in response planning.
At a February 2008 meeting of the Earthquake Engineering Research Institute (EERI), the Quake Group presented results from a ShakeOut simulation — using 1,024 BigBen processors — in which they modeled ground motion up to a frequency of 0.5 vibrations per second (Hz). Unlike other ShakeOut simulations, Hercules employs a “meshing” method — developed and parallelized by O'Hallaron — that tailors the size of subvolumes according to soil stiffness. The advantage of this “adaptive mesh” is that, for a given frequency of vibration, wavelengths are shorter in softer soils, so Hercules refines the mesh to the finer resolution needed to capture these shorter wavelengths accurately, while keeping larger elements in stiffer material. Hercules’ meshing algorithm is highly efficient, and required only 70 seconds on BigBen to build a mesh of 81.5 million elements.
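The wavelength-based sizing behind such an adaptive mesh can be sketched in a few lines. This is only an illustration of the general rule of thumb (several grid points per wavelength); the function name, the points-per-wavelength value, and the velocity figures below are assumptions for the example, not Hercules internals.

```python
# Illustrative sketch: element size needed to resolve seismic waves up to a
# given frequency. Softer soil -> slower shear waves -> shorter wavelengths
# -> smaller elements. Values are hypothetical, not from Hercules.

def element_size(shear_velocity_m_s, max_frequency_hz, points_per_wavelength=10):
    """Element edge length (meters) for the shortest wavelength of interest.

    wavelength = velocity / frequency; a common rule of thumb keeps
    roughly 8-10 points per wavelength.
    """
    wavelength = shear_velocity_m_s / max_frequency_hz
    return wavelength / points_per_wavelength

# Soft basin sediments vs. stiff bedrock, both resolved up to 0.5 Hz
# (the frequency of the EERI-meeting simulation):
soft = element_size(300.0, 0.5)    # slow shear waves in soft soil
hard = element_size(3000.0, 0.5)   # fast shear waves in bedrock
print(soft, hard)  # the soft-soil elements are 10x smaller
```

A uniform mesh would have to use the soft-soil element size everywhere, inflating the element count enormously; sizing each subvolume by local stiffness is what keeps the 81.5-million-element mesh tractable.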
The Quake Group’s video of this simulation, which won EERI’s first annual graphics competition, shows distribution of peak motions, both for the entire region and within smaller regions — information of interest to engineers, who want to know where the largest ground-motions occur. “We show,” says Bielak, “that away from the immediate epicentral region, the largest ground-motion occurs within the sedimentary valleys, where waves are trapped and amplified by the soft soil deposits.”
An important goal of the Quake Group’s work with the ShakeOut scenario has been to compare results with two other SCEC groups who also did ShakeOut simulations, both of which used a finite-difference approach, which offers different tradeoffs than Hercules’ finite-element method. All three groups simulated the same scenario using TeraGrid computing resources. While the Quake team used BigBen, a group from San Diego State University computed at SDSC and a group from URS Corporation used resources at TACC.
Comparing ShakeOut Simulations
The top two planes of this display, developed by PSC scientist Joel Welling, show a “phase mismatch” and an “envelope mismatch” between two simulations, while the third plane represents a difference between the first two planes.
Early on in this work, which proceeded over several years, it was clear that direct comparisons would be difficult due to the large spatial grids and many time steps involved, and because of differences in the simulation algorithms. Wave fronts from the same initializing data, for instance, can propagate at slightly different rates.
To help with this effort, PSC scientist Joel Welling developed a computational routine that statistically compares overall results between two different simulations. The concept is drawn from other research (Kristeková et al.) adapted by Welling to the ShakeOut data. The routine considers two wave-related variables, phase and amplitude (envelope), and provides a graphical representation of mismatches between two simulation outputs. “This provides an outstanding means,” says Bielak, “for comparing between different simulation techniques, and greatly facilitated our verification efforts.”
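The idea of separate envelope and phase comparisons can be sketched in simplified form. The Kristeková et al. criteria operate in the time-frequency domain; the single-band version below, built on the analytic signal from a Hilbert transform, is only meant to show what “envelope mismatch” and “phase mismatch” measure. The function name and normalization are assumptions for this example, not Welling’s routine.

```python
# Simplified, single-band sketch of envelope and phase mismatch between two
# seismograms, in the spirit of the Kristekova et al. misfit criteria.
import numpy as np
from scipy.signal import hilbert

def envelope_phase_misfit(ref, test):
    """Return pointwise envelope and phase differences of two signals."""
    a_ref, a_test = hilbert(ref), hilbert(test)        # analytic signals
    env_ref, env_test = np.abs(a_ref), np.abs(a_test)  # instantaneous amplitudes
    # Envelope misfit: amplitude difference, normalized by the peak
    # reference envelope (an illustrative choice of normalization).
    em = (env_test - env_ref) / np.max(env_ref)
    # Phase misfit: instantaneous phase difference, wrapped to [-pi, pi]
    # and scaled by pi so values lie in [-1, 1].
    pm = np.angle(a_test * np.conj(a_ref)) / np.pi
    return em, pm

# Two synthetic "seismograms": identical amplitude, small phase lag.
t = np.linspace(0.0, 10.0, 2000)
ref = np.sin(2.0 * np.pi * 1.0 * t)
lagged = np.sin(2.0 * np.pi * 1.0 * t - 0.1)
em, pm = envelope_phase_misfit(ref, lagged)
```

Separating the two quantities matters for exactly the problem noted above: two codes can propagate the same wavefront at slightly different speeds, producing a phase mismatch even when the amplitudes (envelopes) agree well.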
Similar statistical analysis is important to challenges posed by the huge datasets that will be produced by petascale computing. For the Quake Group, future work — some of it to be done with TeraGrid resources at TACC and NICS — aims at ShakeOut simulations at frequencies from 1.5 Hz up to 3.0 Hz. These will be the highest-resolution quake simulations ever done, entailing datasets in the range of 500 terabytes. To analyze such data-intensive results, PSC is developing the capability — with help from a large shared-memory system expected to be operational in 2010 — to transpose the data from its time-varying orientation and represent it spatially. This powerful way of seeing earthquake simulation results will allow researchers to focus on specific locales, such as where ground motion is most intense.