PSC computational scientist consultants work closely with discipline scientists to provide tools for visualization and analysis

PSC scientists Nick Nystrom, director of strategic applications, Kent Eschenberg, Joel Welling and John Urbanic have consulted and provided visualization support for a number of research projects at PSC.

When it comes to a scientist’s desire to simulate physical phenomena in realistic detail, there may be no such thing as “too much information.” Running simulations on massively parallel computational engines like PSC’s Cray XT3, however, brings enormous challenges in analyzing and interpreting the information produced. Regardless of the target problem — fluid dynamics of blood flow in an artery, galaxy formation in the early universe, the evolution of tornadoes, and many others — such computational experiments often produce terabytes of data.

How much information is that? One terabyte is a trillion bytes. In printed form, that’s about 50,000 trees’ worth of paper. Ten terabytes would represent the entire printed contents of the U.S. Library of Congress.

How is it possible to assimilate, analyze, and arrive at conclusions from such mountains of information? A large part of the answer is “scientific visualization” — a range of software tools that can manipulate data from simulations and present it in visual form. Looking at images rather than piles of numbers, scientists rely on the brain’s natural ability to interpret visual information. Much as if they are observing a phenomenon in nature, they can see what happens in their simulation and extract insights in seconds that might otherwise take days or weeks.

PSC scientist consultants provide visualization support for many projects that use PSC resources. They often collaborate closely with researchers to devise the best means to represent data so that it will facilitate interpretation and analysis. Within the past year, PSC consultants have provided this kind of support for a number of major projects including ocean modeling, earthquake soil vibration, and turbulent blood flow in human arteries.

Oceans and Climate

Zulema Garraffo
University of Miami Rosenstiel School of Marine and Atmospheric Science

A team of researchers from the University of Miami Rosenstiel School of Marine and Atmospheric Science and the Naval Research Laboratory is using PSC’s Cray XT3 to simulate ocean climate variability. Their focus is the Atlantic Ocean. Using large-scale models that run for months of XT3 processing time, they simulate the ocean over decades, with the goal of developing realistic models that can couple with atmospheric models and be used to forecast climate change.

Building on a 13-year PSC history of collaboration in ocean modeling, PSC scientist consultants John Urbanic and Kent Eschenberg worked with principal investigator Zulema Garraffo and her colleagues George Halliwell and Alan Wallcraft to develop a 3D visualization capability for their Hybrid Coordinate Ocean Model (HYCOM). Although ocean currents are inherently 3D, visualization of HYCOM results had until this year relied entirely on 2D contour plots.

Drawing on the Visualization Toolkit (VTK), an open-source system for 3D graphics, and ParaView (Parallel Visualization Application), an open-source application that supports parallel systems and large datasets and provides a user interface to VTK, Eschenberg created a “reader” for HYCOM’s hybrid-coordinate grid format. Garraffo and her colleagues are integrating PSC’s HYCOM reader into an automated workflow for remote visualization of simulation data.
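The PSC HYCOM reader itself is a ParaView plugin for HYCOM’s native files, but the kind of translation such a reader performs can be illustrated with a short sketch. The Python example below, using VTK’s Python bindings with an entirely synthetic grid and made-up array names, converts a curvilinear ocean temperature field into a structured-grid file that ParaView can open directly; it is an illustration of the idea, not the PSC code.

```python
# Illustrative sketch only: converts a synthetic curvilinear ocean
# temperature field into a VTK structured grid that ParaView can open.
# The real PSC HYCOM reader is a ParaView plugin for HYCOM's native files.
import numpy as np
import vtk
from vtk.util import numpy_support

nx, ny, nz = 60, 40, 20                      # synthetic grid dimensions

# Synthetic lon/lat/depth coordinates standing in for HYCOM's hybrid grid
lon = np.linspace(-98.0, -80.0, nx)          # degrees east
lat = np.linspace(18.0, 31.0, ny)            # degrees north
depth = -np.linspace(0.0, 2000.0, nz)        # meters, negative downward

grid = vtk.vtkStructuredGrid()
grid.SetDimensions(nx, ny, nz)

points = vtk.vtkPoints()
temperature = np.empty(nx * ny * nz, dtype=np.float32)
i = 0
for k in range(nz):                          # VTK order: x fastest, then y, z
    for j in range(ny):
        for x in range(nx):
            # Vertical coordinate exaggerated so depth structure is visible
            points.InsertNextPoint(lon[x], lat[j], depth[k] * 0.01)
            # Toy temperature: warm at the surface, cooling with depth
            temperature[i] = 28.0 + depth[k] * 0.01
            i += 1
grid.SetPoints(points)

temp_array = numpy_support.numpy_to_vtk(temperature, deep=True)
temp_array.SetName("temperature")
grid.GetPointData().SetScalars(temp_array)

writer = vtk.vtkXMLStructuredGridWriter()
writer.SetFileName("ocean_temperature.vts")
writer.SetInputData(grid)
writer.Write()                               # open the .vts file in ParaView
```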

The PSC HYCOM Reader
A view of the Gulf of Mexico from underneath, with 3D perspective showing the ocean surface, where water is warmest, and a vertical cross-sectional slice showing temperature (increasing from blue to red) as it varies with depth (the vertical dimension is stretched to aid observation). Areas of no color (black) represent land mass, including the island of Cuba. Sampling along a line between top and bottom spheres (white) generates the plot of temperature versus distance.

Deep Currents
This section of the Atlantic Ocean off the U.S. east coast shows velocity magnitude by color (increasing from blue to red) and velocity direction with arrow-shaped glyphs, which allow visualization of the 3D features of the ocean current.


Simulated Quakes

Jacobo Bielak (left) and David O’Hallaron

Jacobo Bielak and his colleague David O’Hallaron of Carnegie Mellon University lead the Quake Project, a large collaborative research team that uses sophisticated computational methods to create realistic 3D models of earthquakes. In collaboration with the Southern California Earthquake Center (SCEC), their work aims to provide information for seismic provisions in building codes that ensure the safest possible structures at reasonable cost.

In November 2006, the Quake team won the HPC Analytics Challenge Award at SC06 in Tampa for “Hercules” — software that coordinates all the stages of large-scale earthquake simulation, from problem definition to final visualization. With this unified framework, all tasks — building a mesh that subdivides the quake region, partitioning the job among hundreds or thousands of processors, running the simulation, and visualizing the results — are performed in place on the XT3. Relying on software called PDIO, developed by PSC staff (see p. 35), Hercules can visualize results in real time as a simulation is running.
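The sketch below is not PDIO, but a toy Python illustration of the general pattern that makes such real-time visualization possible: the simulation side pushes each timestep’s field over a network connection as soon as it is computed, so a remote client can render it while the run is still in progress. The host name, port, and framing are all hypothetical.

```python
# Toy illustration of real-time output streaming (NOT PDIO itself): each
# timestep's field is sent over a TCP socket as soon as it exists, so a
# remote visualization client can render while the simulation is running.
import socket
import struct
import numpy as np

HOST, PORT = "viz-desktop.example.org", 9000   # hypothetical endpoint

def stream_timesteps(num_steps: int, field_shape=(128, 128)) -> None:
    with socket.create_connection((HOST, PORT)) as sock:
        for step in range(num_steps):
            # Stand-in for one simulation timestep's output field
            field = np.random.rand(*field_shape).astype(np.float32)
            payload = field.tobytes()
            # Simple framing: step number and payload length, then the data
            header = struct.pack("!II", step, len(payload))
            sock.sendall(header + payload)

if __name__ == "__main__":
    stream_timesteps(num_steps=10)
```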

Soil Displacement
This graphic shows the displacement from soil vibration in a Hercules run (above) compared to the same region from a TeraShake run (below).

In 2007, PSC worked with the Quake group and SCEC to develop tools that compare results between Hercules and TeraShake, earthquake simulation software developed at SCEC using a technique that differs from that of Hercules. Showing how and why results from the two programs vary is important for validating the models. Doing so requires sophisticated statistical tools, which PSC scientist Joel Welling developed, along with a comparison viewing capability and a new Quake reader for ParaView, optimized for the project’s files, provided by PSC’s Kent Eschenberg.

Statistical Comparison
The top two planes of this display, developed by Welling, show a “phase mismatch” and an “envelope mismatch” between two simulations, while the third plane represents a difference between the first two planes. “Joel Welling’s statistical comparison toolset,” says Bielak, “provides an outstanding means for assessing the quality of our seismic synthetic datasets against actual earthquake records and for comparing results from different simulation techniques. It greatly facilitates our verification and validation efforts.”
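The envelope and phase mismatch idea can be illustrated with the analytic signal: the envelope is the magnitude of the Hilbert-transformed trace and the instantaneous phase is its angle. The Python/SciPy sketch below is a simplified, single-trace version with synthetic signals and made-up normalizations; Welling’s toolset computes statistically richer comparisons across full simulation outputs.

```python
# Simplified single-trace illustration of envelope and phase mismatch
# between two seismograms, using the analytic signal (Hilbert transform).
# The actual PSC comparison toolset is statistically more sophisticated.
import numpy as np
from scipy.signal import hilbert

def envelope_and_phase(trace: np.ndarray):
    analytic = hilbert(trace)
    envelope = np.abs(analytic)                 # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))       # instantaneous phase
    return envelope, phase

def mismatches(trace_a: np.ndarray, trace_b: np.ndarray):
    env_a, ph_a = envelope_and_phase(trace_a)
    env_b, ph_b = envelope_and_phase(trace_b)
    # Normalized RMS differences of envelope and phase (one simple choice)
    env_mismatch = np.sqrt(np.mean((env_a - env_b) ** 2)) / np.sqrt(np.mean(env_a ** 2))
    phase_mismatch = np.sqrt(np.mean((ph_a - ph_b) ** 2)) / np.pi
    return env_mismatch, phase_mismatch

if __name__ == "__main__":
    t = np.linspace(0.0, 10.0, 2000)
    # Two synthetic "seismograms": same wavelet, slightly shifted and rescaled
    a = np.exp(-((t - 4.0) ** 2)) * np.sin(2 * np.pi * 2.0 * t)
    b = 0.9 * np.exp(-((t - 4.2) ** 2)) * np.sin(2 * np.pi * 2.0 * t)
    env_m, ph_m = mismatches(a, b)
    print(f"envelope mismatch: {env_m:.3f}, phase mismatch: {ph_m:.3f}")
```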

The Arterial Tree
“With these improvements, what would take two hours takes only seconds.”

Aortic Arch
Flow through the aortic arch and branching into the brachiocephalic, carotid and other downstream arteries, showing flow velocity (increasing from blue to red).

To model blood flow interactions in different regions of the human cardiovascular system is the goal of research led by George Karniadakis and Leopold Grinberg of Brown University. Their applied mathematics group at Brown, the CRUNCH group, develops algorithms, visualization methods and parallel software for simulations in fluid mechanics. They aim to establish a biomechanics gateway on the TeraGrid with the arterial tree as a simulation framework for research in hemodynamics, disease and drug delivery.

To solve the very complex 3D flow problems of the arterial tree — complicated by many branches and outlets — Grinberg uses NekTar, a program developed at Brown, applied to realistic 3D geometry of the arterial system reconstructed from MRI. Grinberg demonstrated remote visualization with NekTar at SC06 in Tampa. Using PDIO, developed at PSC, to stream data from PSC’s XT3 to a remote desktop computer led to a 100-fold speedup in display compared to the standard data-transfer technique. Such interactive visualizations are a convenient way to present results of blood-flow simulations to doctors.

PSC consultants collaborated with Grinberg to help speed up visualization of solutions from NekTar simulations, which can involve as many as a billion degrees of freedom and produce up to 1.5 terabytes of data per run. Recent PSC optimizations reduce the overall data volume by more than 80 percent while preserving the resolution of the visualization. “With these improvements,” says Grinberg, “what would take two hours takes only seconds.”

