PSC System Powers Libratus, and More

Jan. 31, 2017


In the “Brains vs. AI” competition at the Rivers Casino in Pittsburgh, a CMU School of Computer Science artificial intelligence (AI) program called Libratus beat four of the world’s top players at heads-up, no-limit Texas hold’em poker. Libratus ran on Bridges, a Pittsburgh Supercomputing Center (PSC) system that provided the vast computing power and data capacity needed to achieve this milestone.

Just as a complex application may need a more powerful laptop to run well (or at all), Libratus needed Bridges’ massive computational power (19 million core-hours of it) to calculate its strategy. Just as importantly, every night after play ended, Libratus ran on Bridges to refine its strategy and adapt to the improvements the human players were making in their own.
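CMU has publicly described Libratus as building on counterfactual regret minimization (CFR), an iterative self-improvement method in the same spirit as the nightly refinement described above. As a loose, hypothetical illustration only (not Libratus’s actual code), here is the core “regret matching” update applied to rock-paper-scissors, a game trivially small next to no-limit hold’em:

```python
# Illustration only: a minimal regret-matching loop, the core update inside
# counterfactual regret minimization (CFR). Rock-paper-scissors stands in for
# poker here; the real game is astronomically larger.

def get_strategy(regrets):
    """Play each action in proportion to its positive cumulative regret."""
    positives = [max(r, 0.0) for r in regrets]
    total = sum(positives)
    if total > 0:
        return [p / total for p in positives]
    return [1.0 / len(regrets)] * len(regrets)  # no positive regret: uniform

def train_rps(iterations=10000):
    """Self-play regret matching on rock-paper-scissors."""
    # payoff[a][b] = payoff to the player choosing a when the opponent chooses b
    payoff = [[0, -1, 1],
              [1, 0, -1],
              [-1, 1, 0]]
    regrets = [1.0, 0.0, 0.0]       # small asymmetric seed so play starts off-balance
    strategy_sum = [0.0, 0.0, 0.0]  # running total used for the average strategy
    for _ in range(iterations):
        s = get_strategy(regrets)
        # Expected payoff of each pure action against an opponent playing s
        ev = [sum(payoff[a][b] * s[b] for b in range(3)) for a in range(3)]
        current_ev = sum(s[a] * ev[a] for a in range(3))
        for a in range(3):
            regrets[a] += ev[a] - current_ev  # regret for not having played a
            strategy_sum[a] += s[a]
    total = sum(strategy_sum)
    return [x / total for x in strategy_sum]

avg = train_rps()
print(avg)  # the average strategy approaches the (1/3, 1/3, 1/3) equilibrium
```

It is the average strategy, not the final one, that converges toward equilibrium. CFR applies this same update at every decision point of the game tree, which for no-limit hold’em is why hundreds of nodes and terabytes of memory were needed.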

The XSEDE-allocated Bridges is a new type of supercomputer, designed in part for users who are experts in their fields but not computer programmers. It was designed by PSC and built with hardware acquired through Hewlett Packard Enterprise (HPE), including components from HPE, Intel, and NVIDIA. Bridges, funded by a $17.2M award from the National Science Foundation, is available at no charge for open research and by arrangement for other appropriate uses.

CMU principal investigator Tuomas Sandholm and graduate student Noam Brown worked intensively with John Urbanic, an XSEDE Extended Collaborative Support Service expert at PSC, to optimize their code’s performance on Bridges.

In the tournament, Libratus used approximately 600 of Bridges’ 752 “regular memory” nodes. These nodes each have 128 gigabytes of RAM, or eight times as much memory as a high-end laptop. But just as a computer can run more than one program, Bridges runs applications that accelerate discovery in many research fields, such as the physical sciences, biology, economics, business and policy, and even the humanities.

PSC designed Bridges from the ground up to be easy to use for researchers who need the power of high performance computing (HPC) but “just want to do their work.” For example, web applications (“gateways”) let people solve problems without any programming or HPC expertise, effectively delivering HPC-as-a-Service. Bridges makes it possible for experts in fields that have never before used supercomputers to tackle Big Data problems and answer questions based on more information than any human could read in a lifetime.

  • Researchers at Carnegie Mellon University are applying Bridges to understand the economics of including battery storage in the nation’s electricity grid, offering guidance on how to make the grid more efficient and even out energy- and money-wasting periods of peak and low usage.
  • University of Pittsburgh physician-scientists and Carnegie Mellon philosophers are employing Bridges to tease apart which factors actually cause cancer, lung disease, and changes in brain function, drawing on genomic, imaging, and other Big Data.
  • Scientists at Marshall University in West Virginia are using Bridges to assemble and study the DNA sequences of the endangered Sumatran rhinoceros and the Narcissus flycatcher—two species whose evolutionary histories promise hints as to how species respond to environmental changes, and how they can survive such changes.
  • Scientists associated with the University of Wisconsin’s IceCube South Pole Neutrino Observatory are using Bridges to simulate the observatory’s likely behavior when it detects neutrinos—elusive particles that offer cues to basic laws of physics and the origin of the Universe.
  • University of Illinois social scientists are leveraging Bridges’ power to search hundreds of thousands of historical documents for clues to the history and life experiences of Black women in the U.S. from the 1700s onward.

Bridges is a “heterogeneous” system: It contains different components that are optimal for performing different types of computation. Thanks to software written at PSC and Intel’s new Omni-Path Architecture, scientists can apply different parts of the supercomputer to different portions of their computational problems, allowing productive reuse of existing applications and accelerating results.

  • Bridges’ computational speed overall is 1.35 Pf/s (Petaflops, or quadrillions of 64-bit floating-point operations per second)—about 7,250 times as fast as a high-end laptop. The system’s total memory is 274 TB (or trillions of bytes)—about 17,500 times the RAM in a high-end laptop.
  • Bridges’ 752 regular memory nodes offer 21,056 computational cores, providing rapid calculations for problems that can be split into small components, such as computing the possibilities inherent in a series of poker hands or the energy use of hundreds of buildings over many time cycles.
  • Its 42 large memory nodes provide 3 terabytes of memory (192 times as much memory as a high-end laptop) and up to 80 cores apiece—making assembly of large genome sequences from small DNA fragments possible in hours rather than days.
  • Four 12-terabyte extreme memory nodes power the most memory-hungry tasks, such as assembling plant genomes or the genomic sequences of thousands of microbes at once, work that supports higher crop yields and environmental cleanup using oil-consuming microbes that live near well pads.
  • 48 graphics processing unit (GPU) nodes power Bridges’ capacity for “deep learning” in artificial intelligence and accelerate applications in, for example, biology, chemistry, and materials science.
  • Bridges emphasizes interactivity and supports an extremely wide range of applications, including, for example, data analytic applications using Python, R, Spark, and Hadoop.
  • Bridges supports web-based “gateways” that offer domain-specific computational tools to researchers, allowing them to leverage supercomputing power easily and transparently.
  • Bridges offers 10 PB (petabytes, or thousands of terabytes) of persistent data storage, strengthening its support for advanced data management and community data collections.
  • A virtual tour of Bridges and its complete technical specification are available on the PSC website.
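The laptop comparisons scattered through the list above all divide out to the same implied baseline, which can be checked in a few lines. (The “high-end laptop” itself is never specified; these figures simply follow from the article’s own ratios.)

```python
# Back-of-envelope check of the article's laptop comparisons. The baseline
# laptop is implied rather than stated; dividing the article's figures by its
# own ratios recovers it.
bridges_flops = 1.35e15     # 1.35 Pf/s peak, 64-bit operations per second
bridges_ram_bytes = 274e12  # 274 TB total memory

laptop_flops = bridges_flops / 7250              # speed ratio from the article
laptop_ram_gb = bridges_ram_bytes / 17500 / 1e9  # memory ratio from the article

print(f"implied laptop speed: {laptop_flops / 1e9:.0f} Gf/s")  # ~186 Gf/s
print(f"implied laptop RAM:   {laptop_ram_gb:.1f} GB")         # ~15.7 GB, i.e. about 16 GB
```

Both ratios point to a laptop with roughly 16 GB of RAM, which also matches the other comparisons in the article: 128 GB is eight times 16 GB, and 3 terabytes (3,072 GB) is 192 times 16 GB.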