
Students Install and Benchmark the Quantum Espresso (QE) Code on Bridges-2 for the Final Challenge of the Cluster Competition
Students at “non-R1” institutions — those not at the highest levels of research output — often don’t get as much opportunity to take part in cutting-edge science, technology, engineering, and math (STEM) activities as their R1 peers. This year, PSC served as the “Mystery Mentor” for the Winter Classic Invitational Student Cluster Competition, mentoring student teams from non-R1 schools in installing and using a leading open-source electronic structure package, QE, on the center’s flagship Bridges-2 system.
WHY IT’S IMPORTANT
It’s not exactly news that careers in the STEM fields are among the most promising for young people to explore. Students at colleges and universities that aren’t among the highest producers of research, the non-R1 institutions, don’t always have the opportunity to experience STEM at the highest technical levels. In particular, instruction in basic STEM tools like the Linux operating system, let alone advanced ones such as high performance computing (HPC), may not be available.
That’s why long-time HPC technology and marketing analyst Dan Olds started the Winter Classic Invitational Student Cluster Competition in 2020. The idea was to bring together teams from non-R1 schools and give them a series of high performance computing tasks that would challenge them, expand their abilities, and open them up to career possibilities in STEM.
For the first time this year, the Winter Classic featured a “Mystery Mentor”: an out-of-the-box, unannounced mentor site that hosted the last of the competition’s six computing challenges. That Mystery Mentor was PSC, offering a window into the complex world of quantum physics and chemistry via the center’s NSF-funded Bridges-2 supercomputer.

Valerie Rossi
Manager of Education and Student Programming
“Dan Olds found our former YouTube channel that had all of John [Urbanic’s] and Tom Maiden’s XSEDE training videos. And of course Dan was blown out of the water by them and asked if PSC could participate in some way … and so Sergiu [Sanielevici, PSC’s director of support for scientific applications], Niraj, and I met with Dan and that’s how PSC got involved [as a mentor] … It was really cool for me — being that I’m the advisor [for PSC teams in] cluster competitions, I never get to see behind the scenes … of how the challenges are created.”
— Valerie Rossi, PSC
HOW PSC HELPED
Olds’s interest in PSC as a mentor site began when he saw PSC’s online 2021 Summer Boot Camp playlist on YouTube. This was a series of online classes that PSC’s John Urbanic and Tom Maiden made to help students (and later-career scientists!) learn the supercomputing ropes. The videos, prepared for NSF’s XSEDE program, remain a useful tool today. (XSEDE, in which PSC and other NSF-funded supercomputing sites coordinated to provide a “one-stop shop” for scientists using advanced computing, has since been superseded by the ACCESS program.)
Valerie Rossi, PSC manager of education and student programming, and Dr. Niraj Nepal, PSC senior computational scientist, realized that the 12 student teams would need more than the videos alone.

Niraj Nepal, Ph.D.
Senior Computational Scientist
“Within my third week at PSC, Valerie and Sergiu entrusted me with an unexpected opportunity that broadened my perspective and significantly enhanced my development as an educator. I was from a basic science energy research background, so doing HPC-related education activities was quite new to me … Many of them didn’t even have Linux knowledge before the competition [but] they did exceptionally well, distributing their work [among teammates], participating in the assignments, and summarizing their results.”
— Niraj Nepal, PSC
The task the student teams would tackle, getting the quantum physics and chemistry program Quantum Espresso to run on Bridges-2, would require a series of steps. These started with installing Quantum Espresso on the supercomputer. The students then performed calculations to benchmark the performance of the QE code, using a surface of 112 gold atoms. They first optimized QE on Bridges-2’s central processing unit (CPU) nodes, then repeated the work on graphics processing units (GPUs), using Bridges-2’s late-model V100-32 GPU nodes. GPUs have recently allowed physicists and chemists to accelerate simulations of much more complex scientific processes. Nepal introduced QE and density functional theory (DFT) as implemented in the code, and also helped the students with the technical side of the problem, mentoring them in how to use the software and the PSC cluster. He also talked them through the inevitable hiccups that occur when even the most experienced scientists work with supercomputers.
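To give a concrete sense of what this kind of benchmarking workflow can look like, the minimal sketch below generates and submits Slurm batch jobs that run a Quantum Espresso pw.x calculation at several node counts. It is illustrative only, not the teams’ actual scripts: the partition name “RM”, the module name “QuantumESPRESSO”, the input file name “Au112.scf.in”, and the node counts are assumed placeholders that would need to match the real Bridges-2 configuration and the competition’s input files.

```python
#!/usr/bin/env python3
"""Minimal sketch: benchmark a Quantum Espresso (pw.x) run on CPU nodes
of a Bridges-2-style cluster by sweeping the node count.

Assumptions (not taken from the article): the partition name "RM", the
module name "QuantumESPRESSO", and the input file "Au112.scf.in" are
illustrative placeholders; check the Bridges-2 user guide and
`module avail` for the actual names on the system.
"""

from pathlib import Path
import subprocess

JOB_TEMPLATE = """#!/bin/bash
#SBATCH --job-name=qe-au112-bench
#SBATCH --partition={partition}
#SBATCH --nodes={nodes}
#SBATCH --ntasks-per-node={tasks_per_node}
#SBATCH --time=01:00:00

# Module name is site-specific; assumed here for illustration.
module load {qe_module}

# -npool splits the MPI ranks over k-points; tuning it is part of the benchmark.
mpirun -np $SLURM_NTASKS pw.x -npool {npools} -in {input_file} > {output_file}
"""


def write_job(nodes: int, tasks_per_node: int = 64, npools: int = 4,
              input_file: str = "Au112.scf.in") -> Path:
    """Write the batch script for one point in a node-count scaling study."""
    script = JOB_TEMPLATE.format(
        partition="RM",                      # assumed CPU partition name
        nodes=nodes,
        tasks_per_node=tasks_per_node,
        qe_module="QuantumESPRESSO",         # assumed module name
        npools=npools,
        input_file=input_file,
        output_file=f"au112_{nodes}node.out",
    )
    path = Path(f"qe_bench_{nodes}node.job")
    path.write_text(script)
    return path


if __name__ == "__main__":
    # Sweep a few node counts and submit each job with sbatch.
    for n in (1, 2, 4):
        job = write_job(nodes=n)
        subprocess.run(["sbatch", str(job)], check=True)
```

A GPU run of the same benchmark would follow the same pattern, pointing the jobs at a GPU partition with a GPU-enabled QE build and requesting GPUs through Slurm’s GPU options (for example, --gres=gpu), with the parallelization settings re-tuned for the smaller number of MPI ranks.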
At the end of March, Nepal, with Rossi’s help, judged the teams’ reports on their work. Texas Tech University’s Team Matador came out on top in the PSC challenge, earning the maximum 100 points. Next came a three-way tie at 90 points each between Fayetteville State University’s Bronco-1, Texas Tech’s Team Red Raider, and the University of California Santa Cruz’s Not So Slow Slugs. Ten of the 12 teams took part in the final challenge (the others had been sucked into final exams and couldn’t attend). In the competition’s final overall tallies, Team Red Raider finished at the top of the heap, with Not So Slow Slugs taking second place and Team Matador in third.
You can find out more about the competition and its final results here.