BIG MIXUPS

Their work represents a major advance in the ability to study turbulent diffusion across a vast range of scales.

Tackling the Scale Problem

Perfume spreading through a crowded room, an oil spill mixing with ocean currents, Arctic air bringing winter south in the jet stream across the Great Lakes -- turbulent diffusion is ubiquitous. At small and large scales, from a teaspoon of cream in hot coffee to the global atmospheric patterns that affect climate change, the effects are the same -- random fluctuations, swirls and eddies that accelerate the mixing of one substance with another.

A decades-old question is whether the mathematical laws believed to govern turbulent diffusion hold true. The main problem in testing them has been the immense range of scales involved in some of nature's turbulent systems. The mixing and flow of salinity in the oceans, for example, includes tiny eddies in the tidal pools of a remote cove and currents like the Gulf Stream that span continents. These large systems can't be modeled in a laboratory, and computational models have been limited by algorithms that can deal only with a comparatively small range of scales, on the order of 1 to 1,000. To test the universal laws over a turbulent diffusion expanse that occurs in nature requires handling a scale range of 1 to 100,000, with the number of variables -- and the amount of computation -- growing explosively with each step up in scale.

Using a creative mathematical approach, New York University researchers Andrew Majda and Frank Elliott have carried out computer modeling that represents a major advance in the ability to study turbulent diffusion across the vast range of scales that make up such systems in nature. Tapping the parallel processing power of Pittsburgh Supercomputing Center's CRAY T3D and measuring the results against meteorological data, they've shown that the universal laws hold true, and their landmark work offers potential for accurately modeling extremely large, complicated flow systems. "Turbulence is a very difficult problem," says Majda, "but it's a key piece of the atmospheric puzzle. Our model helps put some pieces in place for the first time in idealized circumstances."

Sketching to Scale

Turbulent diffusion is driven by an immense collection of interacting, similarly shaped whorls or eddies, ranging in size from millimeters to miles, whose collective effect is measured as a velocity field. The Majda-Elliott approach charts the effects of turbulence on two identical particles released into a modeled velocity field. While the starting point is nearly the same for both, diffusion quickly places the particles on divergent paths, for example sending one east and the other west, as they swirl from one eddy to the next. The system is fractal-like -- at every scale, the eddies resemble each other. "Ordinarily it's a very hard thing to model," says Majda. "The idea is to choose an approach that is computationally fast and accurate in that it respects the fractal structure."
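
To make the two-particle setup concrete, here is a minimal Python sketch -- not the Majda-Elliott algorithm, whose details the article doesn't give. It assumes a frozen (time-independent) random field built from Fourier modes spread log-uniformly across scales, with an amplitude law chosen so that velocity differences scale like the 1/3 power of separation; the function name make_velocity_field and all parameter values are illustrative.

    # A minimal sketch, not the authors' method: a random 2-D velocity
    # field built from Fourier modes spanning many scales, used to advect
    # two particles released at nearly the same point.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_velocity_field(n_modes=256, k_min=1.0, k_max=1.0e5):
        # Log-uniform wavenumbers give roughly equal numbers of modes
        # per octave, mimicking the fractal-like hierarchy of eddies.
        k = np.exp(rng.uniform(np.log(k_min), np.log(k_max), n_modes))
        theta = rng.uniform(0.0, 2.0 * np.pi, n_modes)   # mode directions
        phase = rng.uniform(0.0, 2.0 * np.pi, n_modes)
        kvec = k[:, None] * np.column_stack([np.cos(theta), np.sin(theta)])
        perp = np.column_stack([-np.sin(theta), np.cos(theta)])  # incompressible
        # Assumed normalization: a velocity difference across separation r
        # is dominated by modes with k ~ 1/r, so amp ~ k^(-1/3) gives
        # differences that grow roughly like r^(1/3).
        amp = k ** (-1.0 / 3.0)

        def velocity(x):
            arg = kvec @ x + phase
            return (amp[:, None] * np.cos(arg)[:, None] * perp).sum(axis=0)

        return velocity

    v = make_velocity_field()
    p1 = np.array([0.0, 0.0])
    p2 = np.array([1.0e-4, 0.0])     # nearly identical release points
    dt = 1.0e-4                      # crude step; a real run would resolve
    for _ in range(20_000):          # the fastest scales far more carefully
        p1, p2 = p1 + dt * v(p1), p2 + dt * v(p2)
    print("final separation:", np.linalg.norm(p2 - p1))

Even in this toy version, the two paths diverge quickly as the particles feel different eddies, which is the behavior the article describes.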

The problem with existing models is that the algorithms employed to follow the movements of two particles while simultaneously assessing turbulence data around them can track the particles only a relatively short distance -- a few meters, as opposed to the thousands of kilometers needed to account for large flow systems. The information load grows exponentially with each jump to a larger scale, because the calculations try to assess turbulence in all the eddies of each scale. (Each scale represents all the eddies of a particular size.)
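
A back-of-the-envelope count shows how fast that load grows. Assuming, purely for illustration, a two-dimensional domain in which each halving of eddy size quadruples the number of eddies in a scale:

    # Illustrative eddy count: in 2-D, eddies of size L/2^n tile the
    # domain 4^n times, so each finer scale holds four times as many
    # eddies as the one above it.
    import math

    for scale_range in (1_000, 100_000):
        n_scales = int(math.log2(scale_range)) + 1
        total = sum(4 ** n for n in range(n_scales))
        print(f"range 1:{scale_range}: {n_scales} scales, ~{total:.1e} eddies")

Widening the range from 1:1,000 to 1:100,000 multiplies the eddy count by a factor of thousands, which is why brute-force bookkeeping breaks down.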

"It allows us to use only the necessary data and in a sense, filter out the superfluous noise that limits the number of scales you can examine," says Elliott. "It's like drawing a picture using only lines -- it gets the essential information across."

Parallel Power

Majda and Elliott sought out the T3D for its processing power, which allows it to run millions of independent calculations and then pool the results. Using an approach that provides for the random forces at work in diffusion while taking into account the universal laws, their model constructs 1,024 different velocity fields. Each of those runs produces snapshots of every scale, incrementally charting the progress of the particles as they move away from one another. The resulting data are used to build a statistical framework that helps illuminate dispersion paths.
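
In the same illustrative spirit, an ensemble version of the earlier sketch (it reuses make_velocity_field from above) would advect a particle pair through many independent field realizations and pool the squared separations at snapshot times. The 1,024 realizations match the article's number; the snapshot steps and everything else are assumptions.

    # Ensemble sketch, reusing make_velocity_field from the earlier block.
    import numpy as np

    n_fields = 1024                       # realizations, as in the article
    snapshots = [2_000, 4_000, 8_000]     # arbitrary snapshot steps (assumed)
    dt = 1.0e-4
    mean_sq = {s: 0.0 for s in snapshots}
    for _ in range(n_fields):             # reduce n_fields for a quick run
        v = make_velocity_field()
        p1, p2 = np.array([0.0, 0.0]), np.array([1.0e-4, 0.0])
        for step in range(1, max(snapshots) + 1):
            p1, p2 = p1 + dt * v(p1), p2 + dt * v(p2)
            if step in mean_sq:
                mean_sq[step] += float(np.sum((p2 - p1) ** 2))
    for step, total in mean_sq.items():
        print(f"step {step}: mean square separation {total / n_fields:.3e}")

Because each realization is independent, the repetitions can run in any order -- or, as described below, all at once.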

Dispersion of Particle Pairs in a Velocity Field
The fundamental quantity for describing dispersion of particle pairs in a given velocity field is the difference in velocity between any two points. These three graphs show the mean square value of this difference (called the structure function) in relation to distance between the points. In each graph, the solid line shows the theoretical prediction (which says the mean square velocity difference is proportional to separation distance to the 2/3 power). The diamond-shaped points show simulated values of this statistic from averaging over 10 experiments (a), 100 experiments (b) and 1,000 experiments (c). In all cases the measured statistics agree well with theory. This agreement, moreover, ranges from separations of about 10^-12 to 1, indicating that this technique can simulate a turbulent velocity field over a range of scales from a millimeter to a million kilometers.
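
In standard notation, the caption's quantity and prediction read as follows (the proportionality constant is not given in the article):

    \[
    S_2(r) \;=\; \bigl\langle\, |\mathbf{v}(\mathbf{x}+\mathbf{r}) - \mathbf{v}(\mathbf{x})|^2 \,\bigr\rangle \;\propto\; r^{2/3},
    \qquad 10^{-12} \;\le\; r \;\le\; 1 .
    \]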

"We're interested in their distance from each other -- how rapidly that grows as the particles move away from each other," says Elliott. "That reveals the characteristics of the turbulent system, which relates to the transport of tracers like pollutants, heat or water droplets."

The modeling technique, Monte Carlo simulation, involves independent repetitions of the same experiment. If these repetitions are run simultaneously on different processors, says Elliott, then the code achieves perfect scaling. In other words, doubling the number of processors halves the running time. "Because the algorithm requires small memory and because of the large number of available T3D processors," says Elliott, "we can run many experiments simultaneously, while sampling the particle position intermittently allows the program to run economically."
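
In modern terms the strategy looks like the sketch below; the article doesn't show the T3D code, so experiment() here is a hypothetical stand-in for one independent repetition.

    # Embarrassingly parallel Monte Carlo: independent experiments with
    # independent seeds, pooled at the end.
    import numpy as np
    from multiprocessing import Pool

    def experiment(seed):
        # Stand-in for one repetition: a random walk's final squared distance.
        rng = np.random.default_rng(seed)
        steps = rng.standard_normal((10_000, 2))
        return float(np.sum(steps.sum(axis=0) ** 2))

    if __name__ == "__main__":
        with Pool(processes=8) as pool:          # one experiment per task
            results = pool.map(experiment, range(1024))
        print("ensemble mean:", sum(results) / len(results))

Because no worker ever waits on another, dividing the 1,024 experiments among twice as many processors cuts the wall-clock time roughly in half -- the perfect scaling Elliott describes.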

Majda and Elliott believe their model could possibly be adapted for large-scale endeavors, such as reproducing weather patterns or the mechanisms involved in cloud formation. "In cloud formation modeling for instance," explains Elliott, "we could use our technique to move droplets of liquid water around a cloud, integrate that with simulations of the other forces at work in these scenarios, and use that to understand the mechanisms responsible for producing rain."



Researchers: Andrew J. Majda, New York University.
Frank W. Elliott, New York University.
Hardware: CRAY T3D
Software: User-developed code.
Keywords: turbulent diffusion, flow systems, velocity field, fractal structure, eddies, dispersion paths, Monte Carlo simulation, particle dispersion, separation distance, velocity structure function.

Related Material on the Web:
Andrew Majda
Courant Institute of Mathematical Sciences at New York University
Projects in Scientific Computing, PSC's annual research report.

References, Acknowledgements & Credits