
Researchers Use AI to Give Amputees ‘Shared Control’ of Neuroprosthetic Hand

Sep 19th, 2019 2:00pm

The loss of a limb is a life-changing event that can affect every area of a person's life. But today's new and improved prosthetics might ease the process of adjustment somewhat, thanks to recent advances like 3D printed artificial limbs that are both affordable and tailored to the user, AI algorithms that can help "tune" prosthetics quickly and efficiently, and machine learning techniques that allow users to control robotic limbs with a mere thought.

Innovations such as these are opening up new fields like neuroprosthetics, which interweaves elements from neuroscience with biomedical engineering to create devices that can replace or augment a damaged motor, sensory or cognitive ability — like cochlear implants that process auditory signals using a microelectrode array.

But even with all of these enhancements, controlling a prosthetic limb still isn't easy for amputees, particularly when it comes to gripping something with precision. To tackle this problem, researchers at Switzerland's École Polytechnique Fédérale de Lausanne (EPFL) are using machine learning to give users of prosthetic hands better control over each individual finger, while also automating the process of grasping and manipulation, an approach the team calls "shared control."

The idea here is to have AI translate and augment the intended movement of the prosthetic user, especially when the user's muscular activity isn't sufficient to complete a task, like gripping a bottle. A person might lose strength in their grasp, leaving only milliseconds to react and readjust before the object starts to fall. That's where AI can step in to provide assistance, automatically interpreting the user's muscular signals to ensure that the robotic hand won't let go when it isn't supposed to, even before the human brain perceives that the bottle is about to slip.

“Because muscle signals can be noisy, we need a machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements,” explained Katie Zhuang, first author of the study that was recently published in Nature Machine Intelligence.

The system works by first having the prosthetic wearer "train" a machine learning model through a series of hand movements, while sensors placed on the end of the amputated limb collect data on muscular activity. By analyzing and identifying patterns in this data, the algorithm learns to decipher the user's intention in real time and predict the intended hand movements. Once these patterns are learned, the system can help users more accurately control individual fingers of the artificial hand, providing far more dexterity than a conventional prosthesis.
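To make the training-then-prediction loop concrete, here is a minimal sketch of that idea in Python. The paper's actual model, features and sensor layout are not described in this article, so the window size, the RMS amplitude features and the nearest-centroid classifier below are all illustrative stand-ins, not the team's method:

```python
import numpy as np

WINDOW = 50  # samples of muscle activity per analysis window (assumed)

def rms_features(emg_window):
    """Root-mean-square amplitude per electrode channel: a common,
    simple summary of muscle-signal intensity."""
    return np.sqrt(np.mean(np.square(emg_window), axis=0))

class FingerDecoder:
    """Maps muscle-activity features to an intended finger movement."""

    def __init__(self):
        self.centroids = {}  # movement label -> mean feature vector

    def train(self, labelled_windows):
        """labelled_windows: (emg_window, movement_label) pairs recorded
        while the wearer performs a series of cued hand movements."""
        by_label = {}
        for window, label in labelled_windows:
            by_label.setdefault(label, []).append(rms_features(window))
        for label, feats in by_label.items():
            self.centroids[label] = np.mean(feats, axis=0)

    def predict(self, emg_window):
        """Return the trained movement whose centroid is nearest to the
        current window's features, i.e. the decoded intention."""
        f = rms_features(emg_window)
        return min(self.centroids,
                   key=lambda lbl: np.linalg.norm(f - self.centroids[lbl]))
```

In use, each incoming window of sensor data would be passed to `predict`, and the decoded label would drive the corresponding finger of the prosthesis.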

But merely predicting movements is only the start: The next step was to tweak the algorithm further so that an automated process kicks in when the prosthesis wearer attempts to pick up an object. Thanks to the pressure sensors that line each finger of the artificial hand, the algorithm signals the hand to close as soon as the fingers actually touch the object. It's a handy (no pun intended) feature that the team developed, based on previous work on giving robotic limbs the ability to detect and grab objects using data from the sense of touch alone. When the person wants to release the object, they initiate the movement to let go, and the algorithm cedes control back to the user. With the sensors on the user's arm, the robotic hand and the algorithm working in tandem, the process is partly automated, letting the user grasp target objects easily with an adequate amount of force, without worrying about dropping them accidentally.
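The hand-off between user and automation described above can be sketched as a small state machine. The intent labels, threshold value and sensor interface here are assumptions for illustration only; the key behavior, matching the article, is that once the fingers make contact the automation holds the grip, so a momentarily weak or noisy muscle signal does not drop the object, and only an explicit release intent cedes control back to the user:

```python
CONTACT_THRESHOLD = 0.2  # assumed pressure reading indicating finger contact

class SharedControlGrasp:
    """User initiates the grasp; automation maintains it until the user
    deliberately signals release."""

    def __init__(self):
        self.auto_hold = False  # True while automation owns the grip

    def step(self, intent, finger_pressures):
        """Return the commanded hand state ("open" or "close") for one
        control cycle.

        intent: decoded muscle-signal intent, "grasp", "release" or "none"
        finger_pressures: readings from the fingertip pressure sensors
        """
        in_contact = any(p > CONTACT_THRESHOLD for p in finger_pressures)

        if self.auto_hold:
            if intent == "release":
                # Explicit release: automation cedes control to the user.
                self.auto_hold = False
                return "open"
            # A weak or noisy signal ("none") does NOT drop the object.
            return "close"

        if intent == "grasp":
            if in_contact:
                # Fingers have touched the object: automation takes over.
                self.auto_hold = True
            return "close"

        return "open"
```

Calling `step` once per control cycle with the latest decoded intent and pressure readings would reproduce the grasp-hold-release sequence the researchers describe.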

According to the team, their smart artificial limb was successfully tested with three amputees and seven able-bodied test subjects. Though it will be some time before this system for "shared control" is ready for the commercial market, the researchers envision that it would be useful in brain-to-machine interfaces, as well as other bionic prosthetics where some degree of automation and assisted movement is needed. As our population ages, one can imagine such AI-assisted devices helping not just amputees, but also people with limited mobility or other age-related conditions, improve their overall quality of life.

Read more at Nature Machine Intelligence.

Images: EPFL
