A decades-old question is whether the mathematical laws believed to govern turbulent diffusion hold true. The main obstacle to testing them has been the immense range of scales involved in some of nature's turbulent systems. The mixing and flow of salinity in the oceans, for example, involves tiny eddies in the tidal pools of a remote cove as well as currents like the Gulf Stream that span continents. Systems this large can't be reproduced in a laboratory, and computational models have been limited by algorithms that can handle only a comparatively small range of scales, on the order of 1 to 1,000. Testing the universal laws over a turbulent diffusion expanse as it occurs in nature requires handling a scale range of 1 to 100,000, with the number of variables -- and the amount of computation -- growing exponentially with each increase in scale range.

Using a creative mathematical approach, New York University researchers Andrew Majda and Frank Elliott have carried out computer modeling that represents a major advance in the ability to study turbulent diffusion across the vast range of scales that make up such systems in nature. Tapping the parallel processing power of Pittsburgh Supercomputing Center's CRAY T3D and measuring the results against meteorological data, they've shown that the universal laws hold true, and their landmark work offers potential for accurately modeling extremely large, complicated flow systems. "Turbulence is a very difficult problem," says Majda, "but it's a key piece of the atmospheric puzzle. Our model helps put some pieces in place for the first time in idealized circumstances."

The problem with existing models is that the algorithms employed to follow the movements of two particles, while simultaneously assessing turbulence data around them, can track the particles only over a relatively short distance -- a few meters, as opposed to the thousands of kilometers needed to account for large flow systems. The information load grows exponentially with each jump to a larger scale, because the calculations try to assess turbulence in all the eddies of each scale. (Each scale represents all the eddies of a particular size.)
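A back-of-the-envelope count shows why direct simulation hits a wall (this is my illustration, not the article's cost model): fully resolving a scale range R in three dimensions takes on the order of R cubed grid points, so moving from a 1:1,000 range to the 1:100,000 range found in nature multiplies the problem size a millionfold.

```python
# Rough cost of direct simulation: covering a scale range R in d = 3
# dimensions takes on the order of R**3 grid points.
for R in (1_000, 100_000):
    points_3d = R ** 3
    print(f"scale range 1:{R:,} -> ~{points_3d:.0e} grid points in 3-D")
```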


"It allows us to use only the necessary data and in a sense, filter out the superfluous noise that limits the number of scales you can examine," says Elliott. "It's like drawing a picture using only lines -- it gets the essential information across."

"We're interested in their distance from each other -- how rapidly that grows as the particles move away from each other," says Elliott. "That reveals the characteristics of the turbulent system, which relates to the transport of tracers like pollutants, heat or water droplets."

The modeling technique, Monte Carlo simulation, involves independent repetitions of the same experiment. If these repetitions are run simultaneously on different processors, says Elliott, then the code achieves perfect scaling. In other words, doubling the number of processors halves the running time. "Because the algorithm requires small memory and because of the large number of available T3D processors," says Elliott, "we can run many experiments simultaneously, while sampling the particle position intermittently allows the program to run economically."
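The "independent repetitions" structure is exactly what makes the scaling perfect. A minimal sketch of the idea in Python -- with a simple random walk standing in for the real particle-tracking workload, and a process pool standing in for the T3D's processors (both are my substitutions, not the authors' code):

```python
import numpy as np
from multiprocessing import Pool

def one_experiment(seed, n_steps=20000):
    """One independent repetition: a random walk stands in for a single
    particle-tracking experiment (a hypothetical workload)."""
    rng = np.random.default_rng(seed)
    return float(np.sum(rng.standard_normal(n_steps)))

def monte_carlo(n_experiments=64, n_workers=4):
    # Repetitions share no state, so they map onto processors with no
    # communication: doubling n_workers (up to n_experiments) roughly
    # halves the wall-clock time.
    with Pool(n_workers) as pool:
        finals = pool.map(one_experiment, range(n_experiments))
    return float(np.mean(np.square(finals)))  # mean-square displacement

if __name__ == "__main__":
    print(monte_carlo())
```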

Majda and Elliott believe their model could be adapted for large-scale endeavors, such as reproducing weather patterns or the mechanisms involved in cloud formation. "In cloud formation modeling, for instance," explains Elliott, "we could use our technique to move droplets of liquid water around a cloud, integrate that with simulations of the other forces at work in these scenarios, and use that to understand the mechanisms responsible for producing rain."

**References, Acknowledgements & Credits**

Frank W. Elliott, New York University

Andrew Majda, Courant Institute of Mathematical Sciences at New York University

Projects in Scientific Computing, PSC's annual research report.