Part 1: A Problem Rears its Ugly Head

To solve some problems, it really does take a rocket scientist. And it doesn't hurt if a computing system like the CRAY T3D is available. Case in point: August 1995. The Delta II rocket puts a commercial satellite into the wrong orbit.

Designed and built to launch satellites for the U.S. Air Force, the Delta II has been a highly successful project. What happened? A committee of engineers from McDonnell Douglas and The Aerospace Corporation, the Air Force contractors on the Delta II, investigates. They find that a mechanical component malfunctioned, causing a lower-than-intended orbit. What caused the malfunction? Analysis leads them to suspect overheating due to "backflow" from the rocket engines.

How can the engineers confirm this analysis, and in a hurry? No one wants to trust an educated guess, no matter how sophisticated, when millions of dollars are on the line for satellite launches already scheduled. Wind-tunnel testing? This tried-and-true design tool is almost useless for testing rocket performance with the "plume on" -- i.e., with the rocket engines firing -- yet the plume contributes dramatically to the aerodynamics of a rocket in supersonic flight. Furthermore, the Delta II is a clustered or "multi-body" rocket -- a core rocket surrounded by nine boosters. This by itself creates problems for wind-tunnel testing, which even in simple cases is costly and time-consuming. Is there a way out of this bind?

Part 2: The Caltech Group

In the late 1980s, Johnson Wang, a senior engineer at The Aerospace Corporation, developed a numerical scheme for simulating the aerodynamics of complex, multi-body rockets like the Delta II with high accuracy. In technical terms, his software is an efficient "flow solver" for the three-dimensional Navier-Stokes equations, the classic mathematical formulation of fluid flow.
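For reference, the three-dimensional Navier-Stokes equations for a compressible flow can be written in conservation form roughly as follows; this is the standard textbook statement, not the particular discretization used in Wang's scheme:

    \frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{u}) = 0

    \frac{\partial (\rho \mathbf{u})}{\partial t} + \nabla \cdot (\rho \mathbf{u} \otimes \mathbf{u} + p\,\mathbf{I}) = \nabla \cdot \boldsymbol{\tau}

    \frac{\partial (\rho E)}{\partial t} + \nabla \cdot \big( (\rho E + p)\,\mathbf{u} \big) = \nabla \cdot (\boldsymbol{\tau} \cdot \mathbf{u} - \mathbf{q})

Here \rho is density, \mathbf{u} velocity, p pressure, E total energy per unit mass, \boldsymbol{\tau} the viscous stress tensor, and \mathbf{q} the heat flux. A flow solver discretizes these equations on a grid wrapped around the vehicle and marches them toward a solution.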

Scalable, parallel systems team tens, hundreds, or even thousands of processors to work simultaneously on a single computing task. To exploit their potential for improved performance, Wang in 1991 joined forces with Stephen Taylor, head of the Scalable Concurrent Programming Laboratory at Caltech. Taylor works with a group of scientists who have pioneered basic programming technology to address important, large-scale problems in industrial and defense applications. Their work feeds directly into the development of advanced software and the next generation of parallel machines.

Over the past four years, Taylor and Wang have worked to implement their flow solver on a range of parallel computers, including workstation networks and shared-memory parallel systems. Along with improved performance, the parallel implementation adds features that make it especially useful for rocket design, such as the ability to simulate multi-body rockets.
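To give a flavor of how such a parallel implementation works -- this is a minimal sketch of the general idea, not Taylor and Wang's code -- the computational grid is partitioned into blocks, one per processor, and neighboring blocks exchange boundary ("halo" or "ghost") values at each step. The function names and the one-dimensional layout below are illustrative assumptions:

    import numpy as np

    def partition_grid(n_points, n_procs):
        """Split n_points of a 1-D grid as evenly as possible among n_procs."""
        base, extra = divmod(n_points, n_procs)
        counts = [base + (1 if p < extra else 0) for p in range(n_procs)]
        starts = np.cumsum([0] + counts[:-1])
        return list(zip(starts, counts))

    def exchange_halos(blocks):
        """Copy each block's boundary value into its neighbor's ghost cell."""
        for left, right in zip(blocks[:-1], blocks[1:]):
            left[-1] = right[1]    # ghost cell <- right neighbor's first interior value
            right[0] = left[-2]    # ghost cell <- left neighbor's last interior value

    # Illustrative setup: 16 grid points split among 4 "processors",
    # each block padded with one ghost cell on either side.
    parts = partition_grid(16, 4)
    blocks = [np.zeros(count + 2) for _, count in parts]
    for block, (start, count) in zip(blocks, parts):
        block[1:-1] = np.arange(start, start + count)    # fill interior values
    exchange_halos(blocks)
    print(blocks[1])    # ghost cells now hold the neighboring blocks' edge values

In the real solver the decomposition is three-dimensional and the exchanges travel over the machine's interconnect, but the essential pattern -- compute on local data, then trade boundary values with neighbors -- is the same.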

In early 1995, Taylor and Wang put their software through its paces by simulating the Titan IV, another multi-body rocket. These runs proved that the parallel flow solver can provide the kind of data the Delta II team needs. "We can calculate the forces acting on the vehicle," says Taylor. "We can calculate aerodynamic drag and determine the strength of the vehicle's sonic boom. And we can calculate temperature contours that predict heat transfer, which allows engineers to design appropriate shielding and paint."
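As a rough illustration of the kind of post-processing involved -- a simplified, assumed sketch rather than the production code -- an aerodynamic force can be estimated from a computed flow field by summing pressure contributions over surface panels:

    import numpy as np

    def aerodynamic_force(pressures, normals, areas, p_ambient=0.0):
        """Net pressure force on a surface described by flat panels:
        sum of -(p - p_ambient) * area * outward_normal over all panels."""
        pressures = np.asarray(pressures, dtype=float)   # panel pressures, Pa
        normals = np.asarray(normals, dtype=float)       # unit outward normals, shape (n, 3)
        areas = np.asarray(areas, dtype=float)           # panel areas, m^2
        return -np.sum((pressures - p_ambient)[:, None] * areas[:, None] * normals, axis=0)

    # Illustrative two-panel example (all numbers made up):
    force = aerodynamic_force(pressures=[101325.0, 90000.0],
                              normals=[[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]],
                              areas=[2.0, 2.0],
                              p_ambient=95000.0)
    print(force)    # net force vector in newtons

Drag is then the component of such a force along the flight direction, and heat-transfer estimates come, in a similar spirit, from the computed temperature field near the surface.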

Part 3: A Scalable, Parallel Solution

The Delta II simulations used a computational grid composed of about 4.5 million grid points.
Simulations like those needed for the Delta II require an enormous amount of computing -- months under normal circumstances, but months aren't available. Is there a way to get results more quickly? In work during 1995, Taylor, his student Jerrell Watts, and Alan Stagg of Cray Research implemented the flow solver on the CRAY T3D at Pittsburgh Supercomputing Center. Because of the high resolution required for the Delta II computations, the pressing need is memory. The T3D includes 32 billion bytes of memory, which along with its excellent parallel performance should make it possible to get results in a relatively short time span.
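A back-of-envelope estimate suggests why memory is the bottleneck. The figure of roughly 40 stored values per grid point below is an assumption made for illustration; the actual solver's storage requirements are not given here:

    # Back-of-envelope memory estimate. The 40 stored values per grid point is an
    # assumed, illustrative figure (conserved variables, coordinates, work arrays);
    # the real solver's storage per point is not stated in the article.
    grid_points = 4.5e6          # grid points in the Delta II simulation
    values_per_point = 40        # assumed
    bytes_per_value = 8          # double precision

    total_bytes = grid_points * values_per_point * bytes_per_value
    print(f"{total_bytes / 1e9:.2f} GB for the field data")                          # about 1.44 GB
    print(f"{total_bytes / 512 / 1e6:.1f} MB per processor on 512 T3D processors")   # about 2.8 MB

By even this rough measure the field data alone runs to well over a gigabyte, far beyond a typical workstation of the day, while the T3D's 32 billion bytes -- about 64 megabytes on each of its 512 processors -- leave comfortable headroom.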

PSC responds to the situation by making its T3D available for "dedicated" runs. All 512 T3D processors become a powerful team, working together to compute the complete "plume on" flow field of the Delta II. In a matter of days, the researchers have the results they need. The computations quantify all relevant parameters -- velocity, density, pressure -- in three dimensions.


Close inspection of the pressure contours revealed that plume interactions create a backflow region between the main engine and a booster.
The computed results match closely with a pressure-gauge reading taken from the backflow region of the booster rockets during flight, confirming the reliability of the simulations. Using the computed flow-field data, the engineers determine that their analysis is essentially correct: interactions among the booster plumes create a backflow that causes the component to overheat. Adjustments are made, and in November 1995 the Delta II launches a satellite into the correct orbit.


Epilogue

The director of the Delta II program commends the team of researchers and PSC -- a combination of software and hardware ingenuity that has pushed the envelope of high-performance computing and directly contributed to the success of a high-priority national mission. "The T3D saved us," says Taylor. "It reduced the turnaround to two weeks, start to finish. Without the T3D, we wouldn't have been able to have an impact on this project."



Researchers: Stephen Taylor, California Institute of Technology.
Hardware: CRAY T3D
Software: User-developed code.
Keywords: Delta II, backflow, booster rockets, plume on, flow solver, multi-body rocket.

Related Material on the Web:
Delta II Flight Anomaly Investigation
Projects in Scientific Computing, PSC's annual research report.
