This Smart Glove Interprets Sign Language In Real Time

A digital rendering of the system that helps convert sign language into speech. (Image credit: UCLA)

Sign language allows deaf and hard-of-hearing people to communicate quickly and effectively with others who sign. However, because most hearing people do not understand sign language, signers can feel awkward or isolated when communicating outside the signing community. Now, bioengineers at UCLA have designed a glove-like device that can translate American Sign Language into English speech in real time through a smartphone app. Their research is published in the journal Nature Electronics.


“Our hope is that this opens up an easy way for people who use sign language to communicate directly with non-signers without needing someone else to translate for them,” said Jun Chen, an assistant professor of bioengineering at the UCLA Samueli School of Engineering and the principal investigator on the research. “In addition, we hope it can help more people learn sign language themselves.”

The system includes a pair of gloves with thin, stretchable sensors that run the length of each of the five fingers. These sensors, made from electrically conducting yarns, pick up hand motions and finger placements that stand for individual letters, numbers, words and phrases, reports UCLA.

The device then turns the finger movements into electrical signals, which are sent to a coin-sized circuit board worn on the wrist. The board transmits those signals wirelessly to a smartphone, which translates them into spoken words at a rate of about one word per second.
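The article does not describe the wireless protocol between the wrist board and the phone, but the idea of packing a set of finger-sensor readings into a compact radio payload can be sketched as follows. The packet layout, field sizes, and values here are purely illustrative assumptions, not the authors' actual design.

```python
import struct

# Hypothetical packet format for a wrist board's wireless link:
# a 16-bit sequence number followed by five 16-bit sensor readings
# (one per finger), little-endian. Illustrative only.
PACKET = struct.Struct("<H5H")  # 6 x 16-bit fields = 12 bytes

def encode(seq, readings):
    """Pack a sequence number and five finger readings into bytes."""
    return PACKET.pack(seq, *readings)

def decode(payload):
    """Unpack a payload back into (sequence number, readings)."""
    seq, *readings = PACKET.unpack(payload)
    return seq, readings

pkt = encode(1, [512, 300, 88, 1023, 0])
assert decode(pkt) == (1, [512, 300, 88, 1023, 0])
```

A fixed-size binary frame like this keeps per-gesture bandwidth tiny, which matters for a battery-powered wearable radio.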

A sensor-embedded glove and a mobile phone
Image credit: UCLA

To test whether the device could capture the facial expressions that are part of American Sign Language, the researchers added adhesive sensors to testers' faces, between the eyebrows and on one side of the mouth.

Previous wearable systems that offered translation from American Sign Language were bulky and heavy, making them uncomfortable to wear, Chen said.

The UCLA device is made from lightweight, inexpensive, and durable stretchable polymers, and its electronic sensors are similarly flexible and cheap.

In testing the device, the researchers worked with four people who are deaf and use American Sign Language. The wearers repeated each hand gesture 15 times. A custom machine-learning algorithm turned these gestures into the letters, numbers and words they represented. The system recognized 660 signs, including each letter of the alphabet and numbers 0 through 9.
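The training procedure described above, where each gesture is repeated 15 times and a learned model maps sensor readings to signs, can be illustrated with a simple nearest-neighbor classifier over simulated finger-sensor readings. The sensor layout, feature values, and gesture prototypes below are illustrative assumptions; the actual system uses a custom machine-learning algorithm not detailed in the article.

```python
import math
import random

random.seed(0)

# Illustrative only: five flex-sensor readings per gesture (one per
# finger), in arbitrary units. Higher values mean a more bent finger.
PROTOTYPES = {
    "A": [0.9, 0.9, 0.9, 0.9, 0.1],  # fist with thumb to the side
    "B": [0.1, 0.1, 0.1, 0.1, 0.8],  # fingers straight, thumb folded
    "L": [0.1, 0.9, 0.9, 0.9, 0.1],  # index and thumb extended
}

def noisy(vec, scale=0.05):
    """Simulate one repetition of a gesture with sensor noise."""
    return [v + random.uniform(-scale, scale) for v in vec]

# "Training": record each gesture 15 times, as in the study.
training = [(label, noisy(proto))
            for label, proto in PROTOTYPES.items()
            for _ in range(15)]

def classify(reading):
    """Return the label of the nearest training example (1-NN)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(training, key=lambda ex: dist(ex[1], reading))[0]

print(classify(noisy(PROTOTYPES["L"])))  # prints L
```

With well-separated gestures and low sensor noise, even this minimal 1-NN scheme classifies reliably; scaling to 660 signs, including dynamic motions and facial cues, is what requires the more sophisticated model the researchers built.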


UCLA has filed for a patent on the technology. A commercial model based on this technology would require added vocabulary and an even faster translation time, Chen said.

Cathy Russey
Cathy Russey is Online Editor at WT | Wearable Technologies, specializing in writing about the latest medical wearables and enabling technologies on the market. Cathy can be contacted at info(at)wearable-technologies.com.