Turning Gestures into Speech

 A Communication Breakthrough for the Deaf

May 13, 2008

Four ECE students have invented a sensor-equipped glove that can translate gestures into spoken words on a cell phone. Called HandTalk, the project was developed this spring as part of the Embedded Systems Design capstone course taught by Professor Priya Narasimhan. ECE seniors Bhargav Bhat, Jorge Meza and Hemant Sikaria, and an ECE graduate student, Wesley Jin, ran with the idea, resulting in a practical research prototype.

Through her previous Trinetra project, Professor Narasimhan has worked extensively on assistive mobile technologies and embedded systems for the blind and visually impaired. Motorola funded her work through a $35,000 grant to research mobile assistive technologies and to support projects in the capstone course. The HandTalk project resulted from this collaboration with Motorola.

"I had an idea for transforming or evolving some of these technologies to enable an exchange of information between the deaf and those not literate in ASL," said Narasimhan. "So I approached a team of students in my Embedded Systems course with an idea for a gesture-recognition glove that could be interfaced to a smart phone with onboard text-to-speech software."

The central underlying idea was to allow a deaf person literate in American Sign Language (ASL) to communicate with someone who wasn't deaf and did not know ASL.

The student team explored this idea in the 15-week course during the Spring 2008 semester, producing a working prototype that can detect hand gestures and translate them into words from a custom dictionary. The glove's flex/bend sensors detect finger movements and communicate them to an onboard Bluetooth module, which dispatches the signature of those movements to a smart phone. The smart phone, with its preloaded gesture-to-word translation dictionary, converts the received signature into text displayed on the screen, and the phone's text-to-speech software audibly reads out the displayed words. In this way, a deaf person's hand gestures are translated into speech, making it possible to communicate with someone who does not know ASL.
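The dictionary-lookup step in this pipeline can be illustrated with a small sketch. All of the sensor values, gesture mappings, and the matching tolerance below are hypothetical; the actual HandTalk sensor format and dictionary are not described in detail here. The sketch assumes each gesture arrives as a tuple of flex-sensor readings, one per finger, and uses a simple nearest-match lookup to tolerate sensor noise:

```python
# Hypothetical sketch of a gesture-signature-to-word lookup.
# Readings range from 0 (finger straight) to 100 (fully bent),
# one value per finger, as might arrive over Bluetooth.

GESTURE_DICTIONARY = {
    (0, 0, 0, 0, 0): "hello",            # open hand (illustrative mapping)
    (100, 100, 100, 100, 0): "yes",      # fist with thumb out
    (0, 100, 100, 100, 100): "thanks",   # index finger extended
}

def match_gesture(signature, dictionary, tolerance=15):
    """Return the word whose stored signature is closest to the received
    readings, or None if no entry is within the noise tolerance."""
    best_word, best_dist = None, float("inf")
    for stored, word in dictionary.items():
        # Mean absolute difference across the finger sensors.
        dist = sum(abs(a - b) for a, b in zip(stored, signature)) / len(stored)
        if dist < best_dist:
            best_word, best_dist = word, dist
    return best_word if best_dist <= tolerance else None

# A slightly noisy reading of the "yes" gesture still matches.
print(match_gesture((95, 98, 100, 90, 5), GESTURE_DICTIONARY))  # prints "yes"
```

On the phone, the matched word would then be displayed and handed to the text-to-speech engine; the tolerance threshold is the kind of parameter a team would tune against real sensor data.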

The team, working with Professor Narasimhan, intends to continue the development of this prototype to deploy it with the deaf community.

The gesture-recognition glove has recently received press coverage in the Pittsburgh Post-Gazette and was featured by KDKA, the local CBS affiliate. HandTalk has also become a popular topic on blogs covering technology and issues for the hearing impaired.

The HandTalk project is one of several created this year in Professor Narasimhan's Embedded Systems Design course, in which teams of four must develop a product prototype in 15 weeks.

Hemant Sikaria, right, and Jorge L. Meza use a cell phone to check on the status of the glove. Copyright, Pittsburgh Post-Gazette, 2008, all rights reserved. Reprinted with permission.

Related People:

Priya Narasimhan

Related Links:

Pittsburgh Post-Gazette coverage

KDKA coverage