PSC Recognizes Fox Chapel Students for Computation in Science Projects

Projects Use Latest Technology to Help Disabled, Visually Impaired People

Wednesday, May 28, 2014

Science projects by two students from Fox Chapel Area Senior High School, Pittsburgh, have earned special recognition from Pittsburgh Supercomputing Center (PSC) for excellence in computer programming.

Sonia Appasamy designed a computerized tool that helps people with visual impairment by automatically converting a camera feed or a photographic image to a simplified, more cartoon-like image with sharper boundaries and contrasts. Suvir Mirchandani created a web browsing system that tracks eye movements and brain waves, allowing people who do not have use of their arms to surf the Internet.

“These students used pretty simple ideas to come up with tools that are complex and effective,” says Anirban Jana, senior computational scientist at PSC, who selected the winners from among the entrants of the Pittsburgh Regional Science and Engineering Fair in March. While the students’ use of advanced technology was remarkable, he adds, an important differentiator was their grasp of the technologies they were using and their ability to explain them.

In her project, “Tool for the Visually Impaired: Digital Environment Enhancement,” Appasamy used an Xbox Kinect system to capture and manipulate images. Employing a palette of four colors chosen by the user, her system uses an algorithm to define objects and then change their colors so they stand out better from the background. The end result is a cartoon-like image that people with visual impairment can see far more easily.
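The core of such a posterization step can be sketched simply: snap each pixel to the nearest of four user-chosen palette colors. The palette values and function names below are illustrative assumptions, not details from Appasamy's actual system:

```python
# Hypothetical sketch of palette-based posterization: every pixel is
# replaced by the closest of four user-chosen colors, producing the
# flat, cartoon-like regions described in the article.

PALETTE = [(0, 0, 0), (255, 255, 255), (255, 0, 0), (0, 0, 255)]

def nearest_palette_color(pixel):
    """Return the palette color closest to `pixel` in RGB space."""
    return min(PALETTE, key=lambda c: sum((a - b) ** 2 for a, b in zip(pixel, c)))

def posterize(image):
    """Map every pixel of a nested-list RGB image to its nearest palette color."""
    return [[nearest_palette_color(px) for px in row] for row in image]

# A dark-gray pixel snaps to black; a light pink pixel snaps to red.
image = [[(40, 40, 40), (250, 120, 120)]]
print(posterize(image))  # [[(0, 0, 0), (255, 0, 0)]]
```

A production version would also sharpen object boundaries before recoloring, but the nearest-color mapping is what collapses the scene into a few high-contrast regions.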

“There are over 285 million people in the world who are visually impaired,” she says. “I felt a project that could help them might change the lives of millions of people.” Appasamy got the idea while working with a visually impaired programming teacher in Andrew’s Leap, a Carnegie Mellon University School of Computer Science summer enrichment program for high school students; the teacher also helped her test the system. She plans to test the system with more visually impaired users so she can publish it for general use.

In “Fuzzy Logic Based Web Browser for the Disabled,” Mirchandani tackled the problem of allowing people who can’t use their arms or hands to search and select objects on the Web easily and economically.

“The first challenge was just determining how I could approach this kind of problem, what hardware I could use while keeping the system low-cost,” he says. He developed an eye-tracking algorithm to monitor what the user is looking at on a given Web page. He used an electroencephalogram (EEG) headset to allow his system to monitor “beta” brain waves, a measure of concentration, to help the system determine when a user wants to click on a link as opposed to just reading it.

“I had to design an algorithm to determine users’ intent” to click links, Mirchandani says. “Sometimes a user is just browsing around and doesn’t want to click on a link.” Earlier this month, the project also won the Web Innovator Award sponsored by GoDaddy at the Intel International Science and Engineering Fair in Los Angeles. One possibility for releasing the system, he adds, is to offer it through the Apple App Store.
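A fuzzy-logic decision of this kind can be sketched as follows: combine how long the user's gaze has dwelled on a link with a normalized beta-wave concentration reading, and click only when both are high. The thresholds, membership functions, and names here are illustrative assumptions, not Mirchandani's actual design:

```python
# Hypothetical sketch of fuzzy click-intent detection: fuzzy AND
# (minimum) of a "long dwell" gaze signal and a "concentrating"
# EEG beta-wave signal. All thresholds are made-up examples.

def membership(value, low, high):
    """Ramp membership: 0 below `low`, 1 above `high`, linear in between."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def click_intent(dwell_seconds, beta_level):
    """Combine gaze dwell time and normalized beta power with fuzzy AND."""
    long_dwell = membership(dwell_seconds, 0.5, 2.0)   # seconds on the link
    concentrating = membership(beta_level, 0.3, 0.7)   # normalized beta power
    return min(long_dwell, concentrating)

# A long, focused look scores high; a brief glance scores near zero,
# so casual browsing does not trigger spurious clicks.
print(click_intent(1.8, 0.8))
print(click_intent(0.4, 0.9))
```

Using the minimum as the fuzzy AND means a strong beta reading alone cannot fire a click during a quick glance, which matches the browsing-versus-clicking distinction Mirchandani describes.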