Smart Artificial Hand Combines User and Robotic Control for Assistive Solution

Image: EPFL's smart robotic hand. Credit: EPFL

Scientists at Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland have developed a smart robotic hand to help amputees with daily tasks. The artificial hand combines individual finger control and automation for improved grasping and manipulation.


This interdisciplinary proof-of-concept between neuroengineering and robotics was successfully tested on three amputees and seven healthy subjects. By combining these two approaches, user-driven finger control and robotic automation, the technology contributes to the emerging field of shared control in neuroprosthetics. The results were published in Nature Machine Intelligence, EPFL reports.

The robotic hand is intelligent enough to decipher the user's intentions, and once it grasps an object it maintains contact with it for a robust hold. Such automation may make the system more dexterous, more intuitive, and less cumbersome than previous robotic prostheses.

“When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react,” explains Aude Billard who leads EPFL’s Learning Algorithms and Systems Laboratory. “The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.”
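The slip reflex described here can be pictured as a fast sensing-and-reaction loop over the finger pressure sensors. The sketch below is purely illustrative, not EPFL's implementation: the threshold, the example readings, and the function name are assumptions. It flags a sudden relative drop in contact pressure so the controller can tighten the grip well inside the 400 millisecond budget mentioned above.

```python
def detect_slip(previous_pressures, current_pressures, slip_threshold=0.15):
    """Return True if any finger shows a sudden relative drop in contact pressure.

    previous_pressures / current_pressures: per-finger readings from two
    consecutive control cycles (e.g. a few milliseconds apart).
    """
    for prev, cur in zip(previous_pressures, current_pressures):
        if prev > 0 and (prev - cur) / prev > slip_threshold:
            return True
    return False

# Example: pressure on two fingers drops sharply between two control cycles,
# so the controller would command a tighter grip long before 400 ms elapse.
if detect_slip([1.0, 0.9, 0.8], [0.7, 0.6, 0.79]):
    print("Slip detected: tighten grip")
```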

Image: the robotic hand. Credit: EPFL

The machine learning algorithm developed by the researchers first learns how to decode the user's intention and translates it into finger movements of the prosthetic hand. To train the algorithm, the amputee performs a series of hand movements. Sensors placed on the amputee's stump detect muscular activity, and the algorithm learns which hand movements correspond to which patterns of muscular activity. Once the user's intended finger movements are understood, this information can be used to control the individual fingers of the prosthetic hand.
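As a rough illustration of the training step just described, the sketch below pairs EMG windows recorded during prescribed hand movements with movement labels and fits a standard classifier. The feature choices, the linear discriminant model, and all function names are assumptions made for illustration, not the authors' actual algorithm.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def extract_features(emg_window):
    """Simple per-channel features from one EMG window (shape: samples x channels)."""
    mav = np.mean(np.abs(emg_window), axis=0)        # mean absolute value
    rms = np.sqrt(np.mean(emg_window ** 2, axis=0))  # root mean square
    return np.concatenate([mav, rms])


def train_decoder(emg_windows, movement_labels):
    """Learn which muscular activity patterns correspond to which hand movements.

    emg_windows: list of arrays recorded while the user performs prescribed movements.
    movement_labels: the intended movement for each window (e.g. "index_flex").
    """
    X = np.vstack([extract_features(w) for w in emg_windows])
    decoder = LinearDiscriminantAnalysis()
    decoder.fit(X, movement_labels)
    return decoder


def predict_finger_command(decoder, emg_window):
    """Map a new EMG window to the user's intended finger movement."""
    return decoder.predict(extract_features(emg_window)[np.newaxis, :])[0]
```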

“Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” says Katie Zhuang, first author of the publication.
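One common way to extract meaningful activity from noisy surface EMG, offered here only as a generic example rather than the method used in the paper, is to rectify the raw signal and smooth it into an activity envelope before decoding:

```python
import numpy as np


def emg_envelope(raw_emg, window_samples=100):
    """Return a smoothed activity envelope for one EMG channel.

    raw_emg: 1-D array of raw samples. Rectification followed by a
    moving-average window suppresses noise while keeping the overall
    level of muscle activity.
    """
    rectified = np.abs(raw_emg)
    kernel = np.ones(window_samples) / window_samples
    return np.convolve(rectified, kernel, mode="same")
```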


Next, the scientists engineered the algorithm so that robotic automation kicks in when the user tries to grasp an object. The algorithm tells the prosthesis to close its fingers as soon as an object comes into contact with the sensors on its surface.
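The hand-off between user control and automation can be summarized as a single rule applied at every control cycle. The sketch below is a hypothetical simplification, with an invented contact threshold and command names, rather than the published controller:

```python
def control_step(user_finger_command, contact_sensor_readings, contact_threshold=0.05):
    """Return the finger command to send to the hand for this control cycle.

    user_finger_command: the movement decoded from the user's muscle signals.
    contact_sensor_readings: readings from the sensors on the hand's surface.
    """
    object_in_contact = any(reading > contact_threshold
                            for reading in contact_sensor_readings)
    if object_in_contact:
        # Robotic automation takes over: close the fingers to secure the grasp.
        return "close_grasp"
    # Otherwise the user's decoded intention controls the individual fingers.
    return user_finger_command
```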

Sam Draper
Sam Draper is Online Editor at WT | Wearable Technologies, specializing in sports and fitness but also passionate about any new lifestyle gadget on the market. Sam can be contacted at press(at)wearable-technologies.com.