Bridges anchors simulation of sound waves to manage heat, stress in fluid flow

 

Two-step computation holds promise for improving efficiency, reducing stress in power plants, electronics, offshore structures

 

Controlling the transfer of heat and momentum across the very thin boundary layer between a fluid and a solid object is key to problems as different as generating electricity, cooling electronics and managing wave damage in offshore structures. Scientists from Purdue University used Bridges at PSC and then other supercomputers in the XSEDE network to build and then run a massive simulation showing how sound waves can be used to control and tune heat transfer as well as friction between the fluid and the walls of a container, known as skin friction. The technique holds promise for engineering more efficient devices with longer service lives.

Why It’s Important

Though we may not think of it on a daily basis, turbulence is a big deal. It’s the key to transferring heat in a power plant so that it generates electricity more efficiently. It can allow us to keep ever-smaller electronics cool and working properly. It can also let engineers better predict how wave motions can damage offshore structures. In general, controlling turbulence and putting it to productive use can be the difference between systems that run efficiently and safely and systems that tear themselves apart.

An important concept in this work is the boundary layer. Whenever a fluid—whether a liquid like water or liquid helium, or a gas like air—passes through a solid channel, a thin layer of fluid sticks to the walls of the channel. This layer helps transfer heat and momentum between the channel and the fluid.
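For readers who want the numbers behind the words, skin friction and wall heat transfer are usually reported through standard dimensionless coefficients. These are textbook definitions rather than anything specific to this study, with generic symbols for the velocity, density and length scales involved:

    \[
    C_f = \frac{\tau_w}{\tfrac{1}{2}\,\rho\, U^2},
    \qquad
    Nu = \frac{h\, L}{k}
    \]

Here τ_w is the shear stress the fluid exerts on the wall, ρ and U are the fluid’s density and speed, h is the convective heat-transfer coefficient at the wall, L is a characteristic length, and k is the fluid’s thermal conductivity. Lowering C_f while raising Nu is the trade-off the technique described below tries to win.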

 

“… By understanding the full spectrum of instabilities that occur within the boundary layer, then we can come up with ideas to excite those structures that are conducive to either dynamic or heat-transfer improvements … And by moving them, [we] can then do cool things with them; we can abate the negative impact or … improve or enhance [performance] without the major penalty in terms of shear stress or skin friction.”—Iman Rahbari, Purdue University

 

Image: The boundary layer around a human hand, in a Schlieren photograph that shows heat layers in the air. The boundary layer is the bright green border, most visible on the back of the hand. By Gary Settles – Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=29523610

Research Scientist Iman Rahbari, working in the PETAL team of Guillermo Paniagua at Purdue University, wanted to use acoustic streaming to control the boundary layer and skin friction in a fluid traveling through a narrow channel. The idea was to create sound waves in the fluid to disrupt and reshape specific near-wall flow patterns. To begin this work, he built a simulation using a unique one-two computational punch, leveraging the huge memory of PSC’s Bridges platform and then the massive processing capabilities of other systems in XSEDE: Comet at the San Diego Supercomputer Center and Stampede2 at the Texas Advanced Computing Center. PSC is a leading member of XSEDE, which allocated time on all three systems.

How PSC Helped

While Purdue offers powerful campus supercomputing “clusters” to its scientists, Rahbari realized that his simulation would require even more power. Setting up the simulation would initially require the computer to hold large amounts of data ready for its processors to work on. Moving that much data back and forth between storage, such as the hard disk of a personal computer, and the processors would create bottlenecks that choke the calculations. Instead, the data needed to live in the RAM contained in each processing node. Bridges’ 12-terabyte extreme-memory nodes offered nearly a thousand times more RAM per node than a typical PC, making them ideal for this step.
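To make that memory pressure concrete, here is a minimal, illustrative sketch in Python of the kind of step involved. It is not the PETAL team’s code, and the matrix below is a tiny stand-in for the real stability operator, which couples millions of unknowns; the point is that shift-invert eigenvalue solvers factorize the sparse matrix, and that factorization, more than the matrix itself, is what devours RAM as the grid grows.

    # Illustrative only: a tiny stand-in for a global stability eigenvalue problem.
    # The real operator couples the flow equations on a fine grid and has millions
    # of unknowns; this one is a 1-D Laplacian that a laptop can handle.
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import eigs

    n = 2000                                    # toy size; the real case is far larger
    main = 2.0 * np.ones(n)
    off = -1.0 * np.ones(n - 1)
    A = diags([off, main, off], [-1, 0, 1], format="csc")

    # Shift-invert targets the eigenvalues nearest sigma. Internally it factorizes
    # (A - sigma*I); that factorization is where memory use explodes as grids grow,
    # which is why the setup stage needed Bridges' large-memory nodes.
    vals, vecs = eigs(A, k=6, sigma=0.0, which="LM")
    print(np.sort(vals.real))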

In the first stage, the computations identify the optimal conditions for creating the sound waves and complete the setup for the next step. The simulation then splits into thousands of pieces, all of them computed at the same time so that it finishes in a reasonable amount of time. Comet and then Stampede2, with their tens of thousands of processors, fit the bill for expanding the computation in this second step.
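The pattern behind that second step is domain decomposition: each processor owns one slice of the channel and trades boundary values with its neighbors at every time step. The sketch below, assuming mpi4py is installed, shows only the communication pattern, with a dummy update standing in for the real flow equations.

    # Schematic of splitting one simulation across many processors (mpi4py assumed).
    # Run with, for example:  mpirun -n 4 python split_channel.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n_global = 4096                     # illustrative grid size (assumed to divide evenly)
    n_local = n_global // size          # each rank owns one slice of the channel
    u = np.zeros(n_local + 2)           # two extra "ghost" cells hold neighbor data
    u[1:-1] = rank                      # dummy initial condition

    left = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    for step in range(100):
        # Swap boundary ("halo") values with neighboring ranks each time step.
        comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
        comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[:1], source=left)
        # Dummy smoothing update standing in for the real flow equations.
        u[1:-1] = 0.5 * u[1:-1] + 0.25 * (u[:-2] + u[2:])

    print(f"rank {rank} of {size}: mean value {u[1:-1].mean():.3f}")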

“[The initial simulation] is very memory-intensive rather than CPU-intensive … Our matrices are sparse and large, while solving the eigenvalue becomes larger and larger; our memory cost grows exponentially … Bridges was a massive help for us, we were able to [simulate] one of the largest stability problems ever solved at that time … This is an asset that is very unique to PSC.”—Iman Rahbari, Purdue University

Rahbari set up his simulations to test different flow speeds and different levels of fluid resistance, or viscosity. He simulated a fluid moving through the channel at Mach 0.75 and Mach 1.5, three-quarters of and one-and-a-half times the speed of sound in the fluid. By testing flows at a Reynolds number of either 3,000 or the effectively less-viscous 6,000, he sampled two interesting flow regimes. At the higher Reynolds number, the fluid would flow with strong turbulence and a lot of mixing. At the lower number, the flow would be unstable, with moments and regions of smooth, low-friction or “laminar” flow and moments and regions of turbulence. By vibrating the walls of the channel to create sound waves, he could test different sound frequencies and volumes and observe their effects on heat transfer and friction with the channel walls.
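For reference, the two dimensionless numbers in that paragraph have standard definitions; the specific velocity and length scales used in the study are not spelled out here, so U, c and H below are generic channel-flow scales:

    \[
    Ma = \frac{U}{c},
    \qquad
    Re = \frac{\rho\, U\, H}{\mu}
    \]

U is the flow speed, c the speed of sound in the fluid, ρ the fluid’s density, μ its viscosity, and H a characteristic channel dimension. A higher Reynolds number means inertia outweighs viscous friction more strongly, which is why the 6,000 case behaves as the “less viscous” of the two.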

The simulations showed that he could tune acoustic streaming to maximize heat transfer between the fluid and the walls of the channel while minimizing friction and stress on the container. He and Paniagua published their results, the first peer-reviewed study of this new technique, in the Journal of Fluid Mechanics in 2020. Today they are repeating their experiments in the laboratory, verifying that the computer predictions match real-world performance.