Projects in Scientific Computing

Download the PDF version of this article as it appeared in Projects in Scientific Computing, 2003.


People in greater Los Angeles won’t soon forget January 17, 1994, 4:30 a.m. For eight seconds that may have seemed like eight hours, as if Mother Earth herself were restless in bed, nearly everyone across a 2,500 square-mile area jerked awake, adrenaline pumping.

Striking the densely populated San Fernando Valley of northern LA, the Northridge earthquake was the second time in 60 years that the earth ruptured directly beneath a major U.S. urban area. By the time the dust settled and officials counted the toll, 57 people were dead, more than 1,500 seriously injured. Collapsed freeways choked traffic for days. Over 12,000 buildings and 170 bridges sustained moderate to severe damage. Total economic loss was estimated at $20 billion.

Had building codes not required earthquake-resistant structures, fatalities and damage would have been much worse. Still, the loss was enormous, and one of the main lessons of Northridge, as well as other urban earthquakes of recent decades, is the need for better information about where and how much the ground will shake.

(l to r) Omar Ghattas, Jacobo Bielak, David O'Hallaron.

“We’ve learned that the severity of ground shaking and consequent damage patterns vary significantly within relatively small areas,” says Jacobo Bielak, professor of civil and environmental engineering at Carnegie Mellon University. “Even from one block to the next, the level of shaking can change dramatically due to types of subsurface soil and rock and other geological characteristics and the nature of the seismic waves.”

Bielak and his Carnegie Mellon colleagues Omar Ghattas and David O’Hallaron lead the Quake Group, a large collaborative research team that uses sophisticated computational methods to create realistic 3D models of earthquakes in geologically complex basins. The work is supported by the National Science Foundation and has been carried out in collaboration with, and with additional support from, the Southern California Earthquake Center (SCEC). Their objective is to provide accurate forecasts of earthquake ground motion as a necessary step toward building codes that provide for the safest possible structures at reasonable cost.

They have used LeMieux, PSC’s terascale system, to great advantage, taking big steps forward in their work. “We’ve benefited enormously from having this powerful system at PSC,” says Ghattas, “and we’ve developed algorithms that maximize our ability to use it well.”

Using as many as all 3,000 LeMieux processors at one time with high efficiency, they have carried out the most detailed, accurate simulations yet of the Northridge quake, at twice the frequency of prior models. They’ve also made major inroads on an important problem called “the inverse problem,” the goal of which, magical as it may seem, is to determine subsurface geology by working backward from seismic measurements at the surface.

Wavelength Tailoring

Ancient advice says it’s better to build your house on rock than sand. If you live in an earthquake basin, that ancient advice, generally speaking, still holds.

From soft soils near the surface to hard rock deeper down and in the mountains, subsurface material varies tremendously in stiffness, the property that dictates seismic wavelength. For a given frequency, the softer the material, the shorter the seismic waves, which demands finer mesh resolution, and thus more computing, to accurately model the shaking.

“We’ve found,” says Bielak, “that even within a few hundred meters, the variability in soil properties — and therefore ground motion — can be very substantial. Because of this, similar buildings located near each other can experience significantly different levels of damage.”


Wavelength-Tailored Mesh
This closeup shows how grid spacing is smaller in regions of slower seismic waves. Distance between points at the smallest elements is about 50 meters.

To accurately capture this wide range of ground vibration in a large earthquake basin like Los Angeles poses enormous challenges for earthquake modeling. One of the Quake Group’s key strategies has been to tailor their computational mesh - which divides the basin into millions of subvolumes - to soil stiffness.

They generate their LA Basin computational model from a geological model created at the SCEC. Where the SCEC model indicates softer soils, therefore shorter wavelengths, the mesh-generating software creates a denser mesh.
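The sizing rule can be sketched with a small calculation. This is a minimal illustration, not the Quake Group’s actual code; the velocities and the points-per-wavelength count are hypothetical round numbers. Element spacing is set so that a fixed number of grid points span the local seismic wavelength (wave speed divided by frequency), so slower, softer material automatically gets smaller elements.

```python
# Sketch: wavelength-tailored element sizing (hypothetical values).
# For a target frequency f, local shear-wave speed v gives wavelength
# v / f; spacing h keeps roughly 10 grid points per wavelength.

def element_size(v_shear_mps, f_hz, points_per_wavelength=10):
    """Local mesh spacing (meters) for a given shear-wave speed."""
    wavelength = v_shear_mps / f_hz
    return wavelength / points_per_wavelength

# Soft basin soil vs. hard basement rock at 1 Hz (illustrative speeds):
h_soft = element_size(500.0, 1.0)   # slow soil -> 50 m spacing
h_rock = element_size(3000.0, 1.0)  # hard rock -> 300 m spacing
print(h_soft, h_rock)
```

Because a uniform mesh would need the soft-soil spacing everywhere, an adaptive mesh coarsens by roughly the cube of the spacing ratio in rocky regions, which is where the large savings over uniform meshes come from.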

“By using disk space instead of computer memory,” says O’Hallaron, “our out-of-core algorithms can generate an extremely large mesh.” For the recent simulations, they represented the basin as 80 kilometers on each side by 30 kilometers deep. Within this volume, their irregular mesh maps more than 100 million subvolumes, making their computations with LeMieux the largest unstructured mesh simulations ever done.
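The out-of-core idea can be sketched in a few lines. This is a toy illustration of the general technique, not the Quake Group’s mesh generator: data is produced and consumed one fixed-size chunk at a time, with the full dataset living on disk, so peak memory stays at a single chunk no matter how large the whole becomes.

```python
# Sketch: out-of-core processing (hypothetical, not the Quake tools).
# Write data to disk in fixed-size chunks, then stream it back for
# processing, so memory use is bounded by one chunk.
import os
import tempfile
import numpy as np

chunk_len, n_chunks = 1000, 5
path = os.path.join(tempfile.mkdtemp(), "mesh_nodes.bin")

with open(path, "wb") as f:          # write phase: one chunk in RAM
    for i in range(n_chunks):
        chunk = np.arange(i * chunk_len, (i + 1) * chunk_len,
                          dtype=np.float64)
        chunk.tofile(f)

total = 0.0
with open(path, "rb") as f:          # read phase: stream chunks back
    while True:
        chunk = np.fromfile(f, dtype=np.float64, count=chunk_len)
        if chunk.size == 0:
            break
        total += chunk.sum()

print(total)  # sum of 0..4999
```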

“These are the most highly resolved LA Basin earthquake simulations to date,” says Ghattas, “and they are made possible by our adaptive meshes and their low memory requirements. To achieve similar accuracy with a uniform mesh would require 1,000 times more computing power.”

Realistic Frequencies & The Inverse Problem

Using 2,048 LeMieux processors, they simulated the Northridge quake for more than 30 seconds of shaking. Their “wave-propagation” software sustained exceptional performance - nearly a teraflop (a trillion calculations a second) over six hours of computing time. And it ran at nearly 90 percent parallel efficiency, a measure of how well the software uses many processors at the same time.
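Parallel efficiency as used here can be illustrated with a small calculation; the timings below are made-up numbers for illustration, not measurements from these runs. Efficiency on P processors is the speedup over one processor divided by P, so 90 percent means the code extracts nine-tenths of the machine's ideal parallel performance.

```python
# Sketch: parallel efficiency = speedup / P (illustrative numbers only).

def parallel_efficiency(t_serial, t_parallel, n_procs):
    """Fraction of ideal speedup achieved on n_procs processors."""
    speedup = t_serial / t_parallel
    return speedup / n_procs

# Hypothetical: work that would take 12,288 hours on one processor
# finishing in 6.7 hours on 2,048 processors -> roughly 0.90.
eff = parallel_efficiency(12288.0, 6.7, 2048)
print(eff)
```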

Ground-motion frequency is a key factor for building design, and this simulation accounted for shaking up to one vibration cycle per second (1 Hz) - double the previous high (0.5 Hz). Earthquake modeling has been limited in its ability to simulate higher frequencies, from 1 to 5 Hz, which present the greatest danger to “low-rise” structures - most city buildings, predominantly two to five stories - because each doubling of frequency means a 16-fold increase in computing.
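The 16-fold figure follows from how cost scales with frequency: doubling the frequency halves the wavelength, which halves the grid spacing in each of three spatial dimensions (8 times more elements) and also halves the stable time step (twice as many steps), for 2 to the 4th power, or 16 times, the work. A quick check:

```python
# Sketch: computing cost growth when raising the target frequency.
# Halving grid spacing in 3 dimensions: factor**3 more elements;
# halving the time step: factor more steps -> factor**4 total.

def cost_factor(freq_ratio):
    """Relative computing cost when frequency grows by freq_ratio."""
    return freq_ratio ** 4

print(cost_factor(2))  # 1 Hz -> 2 Hz: 16x the work
print(cost_factor(4))  # 1 Hz -> 4 Hz: 256x the work
```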

Snapshots of Propagating Waves at t = 5.12 sec, t = 12.8 sec and t = 20.48 sec.

Snapshots of Propagating Waves
These images from the simulation show seismic waves propagating on the LA Basin surface, outward from the epicenter (white dot) and fault plane (rectangle) at three successive times, with color (blue to red) corresponding to frequency of vibration (0 to 1 Hz). "The directivity of the ground motion along strike from the epicenter," says Bielak, "and the concentration of motion near the fault corners follows a pattern observed during the actual earthquake."


“Our challenge is to attain realistic frequencies,” says Bielak. “Now, for the first time, we’re in the range that engineers need to know about. Typically, they want to see results up to 4 Hz, which points to the need for more computational power.”

The simulation reproduced ground motion of the Northridge quake more accurately than previously possible, but, not surprisingly, at some locations it failed to reproduce significant shaking. These discrepancies, notes Ghattas, are inevitable considering that the geological model is inherently incomplete. “Because of uncertainties in what we know about earthquake source and basin material properties, a critical challenge facing us is to obtain these properties by source inversion from records of past earthquakes.”


Solving the Inverse Problem
As a first test for the extreme computational challenge of the inverse problem, the Quake Group chose a 2D shear-wave velocity distribution (top) in a 35 km x 20 km cross-section of the LA Basin model as a synthetic target. Starting with ground-motion measurements on the surface (64 points distributed evenly), the inverse algorithm (with a 257 x 257 grid) arrived at material properties for the cross-section that gave a velocity distribution (bottom) in close match to the target.

This problem - the inverse problem - is one of the important challenges of computational science and engineering, with potential applications in many fields, and it is key to the Quake Group’s goals. Ghattas and his former students Volkan Akcelik and George Biros won the best paper award last year at Supercomputing 2002 for their inverse wave-propagation algorithm that exploits parallel systems like LeMieux.

Using a sophisticated mathematical approach, their algorithm makes it possible to ascertain deep geological features based on seismic recordings on the surface. The deep geology is often not well understood and is known to play an important role in surface shaking. With LeMieux, for the first time, the Quake team solved a test case in two dimensions that proves the feasibility of this inverse approach.

“The inverse problem is orders of magnitude more difficult than the forward problem,” says Ghattas. “Large parallel systems and powerful algorithms are crucial.” One of the Quake Group’s near-future plans for LeMieux is to further test their inverse approach with the added difficulty of three dimensions.
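The structure of such an inverse problem can be illustrated with a toy example, entirely hypothetical and far simpler than the Quake Group’s adjoint-based wave-propagation inversion: given travel times measured at the surface along rays whose path lengths through each subsurface layer are known, recover each layer’s slowness (the inverse of wave speed) by least squares.

```python
# Toy inversion sketch: recover subsurface properties from surface data.
# Unknowns: slowness (1/velocity) of three layers. Data: travel times
# along four rays whose path length through each layer is known.
import numpy as np

# Path lengths (km) of four rays through three layers (hypothetical).
G = np.array([[10.0, 0.0, 0.0],
              [ 8.0, 5.0, 0.0],
              [ 6.0, 7.0, 3.0],
              [ 4.0, 6.0, 8.0]])

s_true = np.array([0.5, 0.25, 0.2])  # slowness, s/km (hypothetical)
t_obs = G @ s_true                   # "measured" surface travel times

# Least-squares solution of the overdetermined linear system.
s_est, *_ = np.linalg.lstsq(G, t_obs, rcond=None)
print(s_est)
```

The real problem replaces this tiny linear system with a nonlinear, wave-equation-constrained optimization over millions of unknowns, which is why large parallel systems and specialized algorithms are essential.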

Volkan Akcelik, Jacobo Bielak, George Biros, Ioannis Epanomeritakis, Antonio Fernandez, Omar Ghattas, David O'Hallaron, Eui Joong Kim, Julio Lopez and Tiankai Tu, Carnegie Mellon University

Terascale Computing System

User-developed code.

Related Material on the Web:
The Quake Project at Carnegie Mellon University.
Getting Ready for the Big One, Projects in Scientific Computing, 1997.
Supercomputers let scientists break down problems in reverse for better quake models, Pittsburgh Post-Gazette.

V. Akcelik, J. Bielak, G. Biros, I. Epanomeritakis, A. Fernandez, O. Ghattas, E. J. Kim, J. Lopez, D. O'Hallaron, T. Tu & J. Urbanic, "High Resolution Forward and Inverse Earthquake Modeling on Terascale Computers," preprint (2003).

J. Xu, J. Bielak, O. Ghattas & J. Wang, "Three-dimensional nonlinear seismic ground motion modeling in inelastic basins," Physics of the Earth and Planetary Interiors, 137 (1-4), 81-95 (2003).

Michael Schneider, Pittsburgh Supercomputing Center

Web Design:
Sean Fulton, Pittsburgh Supercomputing Center

Revised: November 10, 2003