Pittsburgh Supercomputing Center
News Release

February 8, 2001

Virtualized Reality 3-man basketball

Three-man basketball is one demonstration of Carnegie Mellon's patented Virtualized Reality™ dynamic event-modeling technology, developed by Takeo Kanade. In a fully digital 3D Room at Carnegie Mellon's Robotics Institute, an array of video cameras records the action, which is then reconstructed so that the viewer can select a point of view independent of camera location. Using PSC's Terascale Computing System, Kanade plans to expand this approach to stadium scale in time for the 2002 World Cup in Japan.

Super Bowl Replay Technology Draws on Carnegie Mellon and Pittsburgh Supercomputing Center Expertise

PITTSBURGH — This year's Super Bowl broadcast introduced an instant replay technology, called Eye Vision, that allowed viewers to see a play as if time were frozen while a camera circled around the action. It's an impressive gee-whiz effect that helps to resolve difficult instant-replay calls. To develop Eye Vision, CBS turned to Carnegie Mellon University scientist Takeo Kanade, who in turn drew on the parallel systems and visualization expertise of the Pittsburgh Supercomputing Center.

One day last spring, CBS personnel visited Pittsburgh to see Kanade, who heads Carnegie Mellon's Robotics Institute, about developing a robotic multi-camera system. Drawing on his computer-vision and system-design wizardry, and on programmers and software designers from Carnegie Mellon and PSC, a crash project overcame hardware and software obstacles in time for Super Sunday.

"The challenge was to get the entire system of many robotic camera-heads to perform in real time," says Kanade. "You want to look at an athlete sprinting down the field. You have 30 cameras spaced around the upper deck of the stadium, and some of them are going to be 50 meters away and some 150 meters away from the point of action. All 30 cameras must point precisely at the same time at the same point, in focus, with the same relative field of vision. Getting 30 cameras to do this, when they're outdoors in a stadium, exposed to the weather — not a laboratory setting — is, to begin with, a problem in selection of hardware."

The pan-tilt mounts on normal security cameras don't provide adequate control, so Kanade turned to Mitsubishi Heavy Industries, which adapted a robotic arm used in the auto industry.

Carnegie Mellon research scientists Robert Collins and Omead Amidi designed an operating system to control the cameras, and Collins and PSC parallel-systems engineer John Urbanic did the programming. A human-operated master camera has sensors that record pan-tilt angle, focus and zoom. The master camera feeds this information to a central computer, which computes the appropriate control signal for the other cameras and sends it to them. Each camera records an image at the same instant and sends it to a very fast video disc, one disc per camera, for controlled playback.

The entire system completes one of these cycles 33 times every second, and the synchronization required imposes a "real time" systems-design approach. "Real time," says Urbanic, "means the system has to respond precisely in time. Many systems that run very fast, like PSC's CRAY T3E system, which runs thousands of times faster than the PCs used for this project, must periodically slow down for tasks such as input/output. With a real-time system, the processing can never take a break."

The real-time requirements ruled out productivity tools, such as object-oriented code and Java, which trade reduced programmer time for less efficient computer performance. Urbanic's control-system program — done in low-level C, with machine-language drivers for some parts — corresponds closely to the instruction set built into the computer. "When it comes to a system like this," says Urbanic, "a factor of 10 or 100 times faster for a single operation makes a big difference."

Along with the ability to view output from one video disc in time sequence, as in normal instant replay, Eye Vision can play back by cycling through the 30 discs as if they were in time sequence, creating the effect familiar to movie-goers who saw The Matrix.

By the end of this year, PSC's new Terascale Computing System will support more advanced visualization approaches, which require more powerful computing than the clustered PCs used in Eye Vision. The TCS, developed pursuant to a $45 million award from the National Science Foundation, is slated to be the most powerful system in the world available for public research.

More on the TCS: http://www.psc.edu/machines/tcs

More on Takeo Kanade's research: http://www.ri.cmu.edu/people/kanade_takeo.html

The Pittsburgh Supercomputing Center is a joint effort of Carnegie Mellon University and the University of Pittsburgh together with the Westinghouse Electric Company. It was established in 1986 and is supported by several federal agencies, the Commonwealth of Pennsylvania and private industry.


Michael Schneider
Pittsburgh Supercomputing Center

Anne Watzman
Carnegie Mellon University

© Pittsburgh Supercomputing Center (PSC)
Revised: February 8, 2001

URL: http://www.psc.edu/publicinfo/news/2001/superbowl-02-08-01.html