Bridges-2 Webinar Series
Quantum Machine Learning Workflows: What HPC Enables and What Quantum Hardware Delivers
April 3, 2026
1:00 pm – 2:00 pm Eastern Time
Speaker: Daniel Justice, Software Engineering Institute (SEI), Carnegie Mellon University
Abstract: Quantum Machine Learning (QML) is often presented as a near-term application of quantum computing, yet practical deployment remains constrained by hardware limitations and workflow complexity. In this talk, we examine how QML models are actually developed and evaluated today, with a focus on the interplay between high-performance computing (HPC) systems and emerging quantum hardware.
We walk through a real-world case study in which a quantum convolutional neural network (QCNN) is deployed across classical simulators and multiple quantum processing units (QPUs). While simulations achieve strong performance, results on hardware show significant degradation and variability, highlighting the impact of noise, compilation strategies, and device-specific constraints.
Using this example, we outline where HPC resources play a critical role in QML, including large-scale simulation, hybrid optimization loops, and parameter exploration. We also discuss current limitations in reproducibility, benchmarking, and model portability across quantum systems.
The goal of this talk is to provide a grounded perspective on what is possible today and to help HPC practitioners understand how systems like PSC’s Bridges-2 can support quantum workflows as the field continues to evolve.
Speaker Bio: Daniel Justice is a researcher in the AI Division at the Software Engineering Institute (SEI) at Carnegie Mellon University. His work focuses on the intersection of quantum computing, machine learning, and large-scale systems, with an emphasis on evaluating how emerging quantum methods perform in practical settings.
He has led efforts to benchmark quantum machine learning models across simulators and multiple quantum hardware platforms, studying the impact of noise, compilation, and system-level constraints on model performance. In addition to his research, he teaches a graduate course on quantum computing, cryptography, and machine learning at Carnegie Mellon.
His broader interests include hybrid quantum-classical workflows, infrastructure for reproducible quantum experiments, and bridging the gap between theoretical promise and real-world deployment.