Gloves fitted with sensors are turning sign language gestures into spoken words via a smartphone. Sign language is how many people with hearing loss communicate with the world around them, and a new technology is emerging to bridge the gap for hearing individuals who have never learned the technique of talking with their hands.

Students from Ukraine have created a system, called Enable Talk, that translates sign language into speech. As first-prize winners in Microsoft's annual Imagine Cup this year in Australia, they took home $25,000 and a dream of giving deaf people a means of communicating with everyone else.

Their prototype is complicated. The gloves have 11 flex sensors, eight touch sensors, an accelerometer, a compass, and a gyroscope built in. The sensor data flows to a controller, which relays it over a Bluetooth connection to a mobile phone. Software on the phone translates the movements into text and a computerized voice; the Microsoft Speech API and Bing API are integral to the project.
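Enable Talk's firmware is not public, but the pipeline described above is easy to picture in code. The sketch below, in Python for readability, shows how one sample from the sensors listed above might be packed into a compact byte frame for the Bluetooth link; the frame layout, field sizes, and simulated readings are all assumptions for illustration.

```python
import struct
import random  # stand-in for real sensor reads; hardware access is hypothetical

# Hypothetical frame layout for one glove sample:
# 11 flex sensors + 8 touch sensors + 3-axis accelerometer,
# compass heading, and 3-axis gyroscope, as described in the article.
FRAME_FORMAT = "<11H8B3hH3h"  # little-endian: uint16 flex, uint8 touch, int16 motion

def read_sample():
    """Simulate one sensor sample; real firmware would read ADC/I2C here."""
    flex = [random.randint(0, 1023) for _ in range(11)]    # bend amount per finger joint
    touch = [random.randint(0, 1) for _ in range(8)]       # contact on/off
    accel = [random.randint(-32768, 32767) for _ in range(3)]
    heading = random.randint(0, 359)                       # compass, degrees
    gyro = [random.randint(-32768, 32767) for _ in range(3)]
    return flex, touch, accel, heading, gyro

def pack_frame(flex, touch, accel, heading, gyro):
    """Pack one sample into the byte frame the controller would push over Bluetooth."""
    return struct.pack(FRAME_FORMAT, *flex, *touch, *accel, heading, *gyro)

frame = pack_frame(*read_sample())
print(f"{len(frame)}-byte frame ready for the Bluetooth link")
```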

The system first recognized fingerspelling, but is being expanded to recognize signs from American Sign Language. In fingerspelling, the fingers shape letters to spell out words one character at a time, while full signs use hand motions to indicate whole words or emotions. The word "computer," for example, is signed by cupping your hand into the letter C and moving it back and forth along your forearm.
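Because fingerspelled letters are mostly static hand shapes, a first version can get surprisingly far with a nearest-pose lookup over the flex-sensor readings. The sketch below illustrates that idea; the letter profiles and the three-sensor simplification are made up for illustration, not calibrated ASL data.

```python
import math

LETTER_POSES = {  # hypothetical flex-sensor profiles (3 sensors shown of 11)
    "C": [400, 420, 430],   # fingers curved into a "C"
    "A": [900, 910, 905],   # fist: all fingers fully bent
    "B": [50, 60, 55],      # flat hand: fingers straight
}

def classify_letter(frame):
    """Pick the letter whose stored pose is closest to the current frame."""
    return min(LETTER_POSES, key=lambda letter: math.dist(frame, LETTER_POSES[letter]))

print(classify_letter([410, 415, 428]))  # -> "C"
```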

"Thank you" is signed by moving your hand out and down, similar to the gesture of kissing one's hand and extending it toward someone else to show your gratitude. You may have seen Marlee Matlin make this gesture in the many movies and TV series, such as CSI: New York, in which she appears. She has said, and the Ukrainian students apparently believe, "The handicap of deafness is not in the ear; it is in the mind."

A video shows how you would tell your deaf Chinese friend, using sign language, that you were born in the Year of the Goat. The demonstration shows the movements that trace the goat's beard and horns: quite a complex gesture for a set of sensors to capture and for the software to translate into a word.

Language, whether spoken or signed, must be taken in context. Translating the subtleties and nuances of hand movement and meaning is a challenge for the Ukrainian team. One motion can have several meanings depending on context, and one word can be signed with several different symbols depending on the region where it is used.

Developing the talking hand

Each "dialect" must be programmed into the computer. If you've ever used Dragon Dictate, you know that you must "teach" the software how your voice sounds as it pronounces words. Likewise, the Enable Talk system must take into account the way an individual gestures and how fast their movements are.
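In code terms, that per-user training might amount to recording a few demonstrations of each sign and storing them as the user's profile. The sketch below assumes such a scheme; the class and method names are hypothetical, not Enable Talk's actual API.

```python
from collections import defaultdict

class SignProfile:
    """Holds one user's recorded examples of each sign."""
    def __init__(self):
        self.templates = defaultdict(list)  # sign name -> list of recorded sequences

    def record(self, sign, sequence):
        """Store one demonstration: a list of per-frame sensor vectors."""
        self.templates[sign].append(sequence)

profile = SignProfile()
# The user demonstrates "thank you" twice; each frame is a (simplified)
# vector of flex-sensor readings. A slower signer simply yields a longer sequence.
profile.record("thank you", [[512, 600, 580], [500, 610, 590], [480, 620, 600]])
profile.record("thank you", [[515, 598, 575], [498, 612, 588]])
print(len(profile.templates["thank you"]), "examples stored for this user")
```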

The gloves send a movement pattern to the mobile device, where the application matches the incoming pattern against stored signs, then plays the corresponding sound and displays the related text. Their video shows how the underpinnings of Enable Talk work.
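The article confirms this matching step but not the algorithm behind it, so the sketch below uses dynamic time warping (DTW), a common choice for comparing gesture sequences, to find the stored sign nearest to an incoming pattern. Treat it as one plausible implementation, not the team's actual one.

```python
import math

def dtw_distance(a, b):
    """DTW cost between two sequences of sensor vectors (lists of numbers)."""
    n, m = len(a), len(b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])        # per-frame distance
            cost[i][j] = d + min(cost[i - 1][j],     # skip a frame in a
                                 cost[i][j - 1],     # skip a frame in b
                                 cost[i - 1][j - 1]) # advance both
    return cost[n][m]

def recognize(incoming, templates):
    """Return the stored sign whose recorded example best matches the incoming pattern."""
    best_sign, best_score = None, math.inf
    for sign, examples in templates.items():
        for example in examples:
            score = dtw_distance(incoming, example)
            if score < best_score:
                best_sign, best_score = sign, score
    return best_sign

# Matching a fresh movement pattern against the user's stored examples:
stored = {"thank you": [[[512, 600, 580], [500, 610, 590], [480, 620, 600]]]}
print(recognize([[510, 602, 582], [495, 615, 592]], stored))  # -> "thank you"
```

DTW tolerates the speed differences mentioned earlier: a slow signer and a fast signer produce sequences of different lengths, but the warping path aligns them frame by frame before the distances are summed.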

Like learning to talk or to walk, Enable Talk must take baby steps before it becomes a universally viable commercial product. But what a big step in the right direction for opening up communication between hearing people and those who are deaf or speech-impaired.