INSIDE LEMIEUX, THE 1994 NORTHRIDGE EARTHQUAKE SHAKES ALMOST LIKE THE REAL THING.
People in greater Los Angeles won’t soon forget January 17, 1994, 4:30 a.m. For eight seconds that may have seemed like eight hours, as if Mother Earth herself were restless in bed, nearly everyone across a 2,500 square-mile area jerked awake, adrenaline pumping.
Striking the densely populated San Fernando Valley of northern LA, the Northridge earthquake was the second time in 60 years that the earth ruptured directly beneath a major U.S. urban area. By the time the dust settled and officials counted the toll, 57 people were dead, more than 1,500 seriously injured. Collapsed freeways choked traffic for days. Over 12,000 buildings and 170 bridges sustained moderate to severe damage. Total economic loss was estimated at $20 billion.
Were it not for building codes that require earthquake-resistant structures, fatalities and damage would have been much worse. Still, the loss was enormous, and one of the main lessons of Northridge, as well as other urban earthquakes of recent decades, is the need for better information about where and how much the ground will shake.
“We’ve learned that the severity of ground shaking and consequent damage patterns vary significantly within relatively small areas,” says Jacobo Bielak, professor of civil and environmental engineering at Carnegie Mellon University. “Even from one block to the next, the level of shaking can change dramatically due to types of subsurface soil and rock and other geological characteristics and the nature of the seismic waves.”
Bielak and his Carnegie Mellon colleagues Omar Ghattas and David O’Hallaron lead the Quake Group, a large collaborative research team. Using sophisticated computational methods, they work to create realistic 3D models of earthquakes in geologically complex basins. This work is supported by the National Science Foundation and has been performed in collaboration with, and with additional support from, the Southern California Earthquake Center (SCEC). Their objective is to provide accurate forecasts of earthquake ground motion as a necessary step toward creating building codes that provide for the safest possible structures at reasonable cost.
They have used LeMieux, PSC’s terascale system, to great advantage, taking big steps forward in their work. “We’ve benefited enormously from having this powerful system at PSC,” says Ghattas, “and we’ve developed algorithms that maximize our ability to use it well.”
Using as many as all 3,000 LeMieux processors at one time with high efficiency, they have carried out the most detailed, accurate simulations yet of the Northridge quake, at twice the frequency of prior models. They’ve also made major inroads on an important problem called “the inverse problem,” the goal of which (magical as it may seem) is to determine subsurface geology by working backward from seismic measurements on the surface.
Ancient advice says it’s better to build your house on rock than sand. If you live in an earthquake basin, that ancient advice, generally speaking, still holds.
From soft soils near the surface to hard rock deeper down and in the mountains, subsurface material varies tremendously in stiffness, the property that dictates seismic wavelength. For a given frequency, the softer the material, the shorter the seismic waves, which in turn demands finer mesh resolution, and therefore more computing, to accurately model the shaking.
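A back-of-the-envelope sketch (not the Quake Group’s code; the wave speeds and the points-per-wavelength rule of thumb are illustrative assumptions) shows why soft soil demands a finer mesh:

```python
# Illustrative only: wavelength = wave speed / frequency, and a common
# rule of thumb of resolving each wavelength with roughly 10 mesh points.

def wavelength_m(shear_velocity_mps, frequency_hz):
    """Seismic wavelength in meters: wave speed divided by frequency."""
    return shear_velocity_mps / frequency_hz

def element_size_m(shear_velocity_mps, frequency_hz, points_per_wavelength=10):
    """Mesh element size needed to resolve the wave at this frequency."""
    return wavelength_m(shear_velocity_mps, frequency_hz) / points_per_wavelength

# Hypothetical values at 1 Hz: soft basin soil (~200 m/s shear velocity)
# versus hard rock (~3,000 m/s).
print(element_size_m(200.0, 1.0))   # soft soil: 20 m elements
print(element_size_m(3000.0, 1.0))  # hard rock: 300 m elements
```

In this toy arithmetic the soft soil requires elements 15 times smaller in each dimension, which is why a mesh tailored to stiffness beats a uniform one.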
“We’ve found,” says Bielak, “that even within a few hundred meters, the variability in soil properties — and therefore ground motion — can be very substantial. Because of this, similar buildings located near each other can experience significantly different levels of damage.”
To accurately capture this wide range of ground vibration in a large earthquake basin like Los Angeles poses enormous challenges for earthquake modeling. One of the Quake Group’s key strategies has been to tailor their computational mesh - which divides the basin into millions of subvolumes - to soil stiffness.
They generate their LA Basin computational model from a geological model created at the SCEC. Where the SCEC model indicates softer soils, therefore shorter wavelengths, the mesh-generating software creates a denser mesh.
“By using disk space instead of computer memory,” says O’Hallaron, “our out-of-core algorithms can generate an extremely large mesh.” For the recent simulations, they represented the basin as 80 kilometers on each side by 30 kilometers deep. Within this volume, their irregular mesh maps more than 100 million subvolumes, making their computations with LeMieux the largest unstructured mesh simulations ever done.
“These are the most highly resolved LA Basin earthquake simulations to date,” says Ghattas, “and they are made possible by our adaptive meshes and their low memory requirements. To achieve similar accuracy with a uniform mesh would require 1,000 times more computing power.”
Realistic Frequencies & The Inverse Problem
Using 2,048 LeMieux processors, they simulated the Northridge quake for more than 30 seconds of shaking. Their “wave-propagation” software sustained exceptional performance - nearly a teraflop (a trillion calculations a second) over six hours of computing time. And it ran at nearly 90 percent parallel efficiency, a measure of how well the software uses many processors at the same time.
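Parallel efficiency can be sketched in a few lines. The timing numbers below are hypothetical, chosen only to illustrate the definition, not measurements from LeMieux:

```python
# Illustrative sketch of how parallel efficiency is defined:
# efficiency = speedup / processor count, speedup = T_serial / T_parallel.

def parallel_efficiency(serial_time, parallel_time, processors):
    """Fraction of ideal speedup actually achieved on `processors` CPUs."""
    speedup = serial_time / parallel_time
    return speedup / processors

# A hypothetical run: work that would take 184,320 hours on one processor
# finishes in 100 hours on 2,048 processors.
eff = parallel_efficiency(184320.0, 100.0, 2048)
print(f"{eff:.0%}")  # prints "90%"
```

At 100 percent efficiency, doubling the processor count would halve the run time; the closer a code stays to that ideal, the better it scales.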
Ground-motion frequency is a key factor for building design, and this simulation accounted for shaking up to one vibration cycle per second (1 Hz) - double the previous high (0.5 Hz). Earthquake modeling has been limited in its ability to simulate higher frequencies, from 1 to 5 Hz, that present the greatest danger to “low-rise” structures - which include most city buildings, predominantly two to five stories - because each doubling of frequency means a 16-fold increase in computing.
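The 16-fold figure follows from dimensional counting: doubling the frequency halves the wavelength, so the mesh spacing must shrink in all three spatial dimensions (8 times the points), and numerical stability forces the time step to shrink by the same factor (2 times the steps). A sketch of that arithmetic, under the simplifying assumption that cost scales directly with points times steps:

```python
# Illustrative scaling argument, not a performance model of the actual code.

def compute_cost_factor(freq_ratio):
    """Relative cost of raising the target frequency by freq_ratio:
    mesh points grow as freq_ratio**3 (one factor per spatial dimension),
    and time steps grow as freq_ratio (the step must shrink with spacing)."""
    points = freq_ratio ** 3
    steps = freq_ratio
    return points * steps

print(compute_cost_factor(2))  # doubling frequency: 16x the computing
print(compute_cost_factor(4))  # 1 Hz to 4 Hz: 256x the computing
```

The same arithmetic shows why reaching the 4 Hz that engineers want is a 256-fold jump over a 1 Hz simulation.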
USING LEMIEUX, THE QUAKE GROUP HAS PROVEN THE FEASIBILITY OF SOLVING THE INVERSE PROBLEM, ONE OF THE IMPORTANT CHALLENGES OF COMPUTATIONAL SCIENCE AND ENGINEERING.
“Our challenge is to attain realistic frequencies,” says Bielak. “Now, for the first time, we’re in the range that engineers need to know about. Typically, they want to see results up to 4 Hz, which points to the need for more computational power.”
The simulation reproduced ground motion of the Northridge quake more accurately than previously possible, but — not surprisingly — at some locations it failed to reproduce significant shaking. These discrepancies, notes Ghattas, are inevitable considering that the geological model is inherently incomplete. “Because of uncertainties in what we know about earthquake source and basin material properties, a critical challenge facing us is to obtain these properties by source inversion from records of past earthquakes.”
This problem - the inverse problem - is one of the important challenges of computational science and engineering, with potential applications in many fields, and it is key to the Quake Group’s goals. Ghattas and his former students Volkan Akcelik and George Biros won the best paper award last year at Supercomputing 2002 for their inverse wave-propagation algorithm that exploits parallel systems like LeMieux.
Using a sophisticated mathematical approach, their algorithm makes it possible to ascertain deep geological features based on seismic recordings on the surface. The deep geology is often not well understood and is known to play an important role in surface shaking. With LeMieux, for the first time, the Quake team solved a test case in two dimensions that proves the feasibility of this inverse approach.
“The inverse problem is orders of magnitude more difficult than the forward problem,” says Ghattas. “Large parallel systems and powerful algorithms are crucial.” One of the Quake Group’s near-future plans for LeMieux is to further test their inverse approach with the added difficulty of three dimensions.
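The idea behind inversion can be shown with a deliberately tiny toy problem, nothing like the Quake Group’s wave-propagation inversion in scale or method, but the same in shape: guess a material parameter, compare the model’s prediction against the observation, and iteratively correct the guess to shrink the misfit.

```python
# Toy inversion sketch: recover an unknown "stiffness" k from a surface
# observation u_obs, given a made-up forward model u = f / k.

def forward(k, f=1.0):
    """Toy forward model: predicted surface response for stiffness k."""
    return f / k

def invert(u_obs, k0=1.0, lr=1.0, steps=200):
    """Gradient descent on the misfit 0.5 * (forward(k) - u_obs)**2."""
    k = k0
    for _ in range(steps):
        residual = forward(k) - u_obs
        grad = residual * (-1.0 / k**2)  # d(forward)/dk = -f / k**2
        k -= lr * grad                   # step against the gradient
    return k

k_true = 2.0                 # the "geology" we pretend not to know
u_obs = forward(k_true)      # the "seismic recording" at the surface
k_est = invert(u_obs)        # working backward from the observation
print(k_est)
```

Here the recovered stiffness converges to the true value. The real inverse problem replaces one scalar with millions of material parameters and the toy forward model with full 3D wave propagation, which is why large parallel systems and specialized algorithms are essential.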
Revised: November 10, 2003
The Pittsburgh Supercomputing Center is a joint effort of Carnegie Mellon University and the University of Pittsburgh together with the Westinghouse Electric Company. It was established in 1986 and is supported by several federal agencies, the Commonwealth of Pennsylvania and private industry.
© Pittsburgh Supercomputing Center.