Machine Learning

How to Control a Robotic Arm with Your Mind, by Using Machine Learning

22 Jan 2017 6:00am

If you’ve lost the use of your arms, controlling a robotic replacement with your mind might sound awesome at first. That is, until you’re told that you would probably need serious surgery to crack open your skull and insert an implant into your brain to actually do it.

Fortunately, scientists at the University of Minnesota have developed an alternative: a technique that allows people to move a robotic appendage around with only their thoughts, without the need for surgery or brain implants. It’s a major step in the development of non-invasive brain-computer interfaces (BCIs), which build a direct communication link between the brain and an external device. Previous experiments showed that brain-computer interfaces could let people control virtual objects, such as a cursor on a screen or a helicopter in a flight simulator, and even real objects like small quadcopters. This study takes it to the next level, with real-world implications.

“This is the first time in the world that people can operate a robotic arm to reach and grasp objects in a complex 3D environment using only their thoughts without a brain implant,” said Bin He, a biomedical engineering professor at the University of Minnesota and lead author on the study, which was published earlier this month in Scientific Reports. “Just by imagining moving their arms, they were able to move the robotic arm.”


Translating Thoughts with Machine Learning

The team’s brain-computer interface was built using electroencephalography (EEG), a monitoring method that tracks the electrical activity of the brain. In particular, the researchers focused on the motor cortex, the part of the brain that oversees movement. When a person moves, or merely thinks about executing a movement, the neurons in this area of the brain generate an electrical current that can be measured with EEG tools. Using advanced signal processing and machine learning, these “thoughts” are then translated in real time into commands that drive the robotic arm.
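The article doesn’t publish the team’s decoding code, but the core idea, turning a measurable change in motor-cortex activity into a discrete command, can be sketched in a few lines. The example below is purely illustrative: it uses synthetic signals and a simple threshold on mu-band (8–12 Hz) power, exploiting the well-known fact that motor imagery attenuates the mu rhythm. Every name, parameter, and value here is an assumption for the sketch, not the study’s actual method.

```python
import numpy as np

FS = 250           # sampling rate in Hz (illustrative)
MU_BAND = (8, 12)  # mu rhythm band; motor imagery suppresses its power

def band_power(signal, fs, band):
    """Mean spectral power of `signal` within `band` (Hz), via FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def make_trial(imagining, n=500, rng=None):
    """Synthetic 2-second EEG trial: imagery attenuates the 10 Hz rhythm."""
    rng = rng if rng is not None else np.random.default_rng()
    t = np.arange(n) / FS
    amp = 0.3 if imagining else 1.0   # event-related desynchronization
    return amp * np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(n)

# "Train" a threshold from labeled trials: midpoint between class means
rng = np.random.default_rng(0)
rest = [band_power(make_trial(False, rng=rng), FS, MU_BAND) for _ in range(20)]
imag = [band_power(make_trial(True, rng=rng), FS, MU_BAND) for _ in range(20)]
threshold = (np.mean(rest) + np.mean(imag)) / 2

def decode(trial):
    """Map one trial to a command: low mu power => the user is imagining movement."""
    return "move" if band_power(trial, FS, MU_BAND) < threshold else "rest"
```

A real system would decode continuous, multi-dimensional movement intent from many electrodes rather than a binary command, but the pipeline shape, feature extraction followed by a learned decision rule, is the same.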

In this experiment, the team used a state-of-the-art EEG cap with 64 electrodes, which measure the electrical activity generated in the test subject’s motor cortex when thinking about movement. Because the brain’s neurons are constantly firing, thoughts about different movements can overlap and produce neural “noise” that needs to be filtered out. These overlapping layers of brain activity are mapped using functional magnetic resonance imaging (fMRI), then sorted and analyzed using advanced signal processing techniques, an approach He and his team explored in previous experiments that laid the foundation for this study.
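One standard preprocessing step for multi-electrode EEG of this kind (a common technique in the field, not necessarily the exact one used in this study) is the common average reference: subtracting the mean across all electrodes from each channel, so activity shared by the whole cap cancels out and channel-specific signals stand out. A minimal sketch with synthetic data:

```python
import numpy as np

def common_average_reference(eeg):
    """Re-reference each channel by subtracting the cross-channel mean.

    eeg: array of shape (n_channels, n_samples), e.g. (64, n) for a
    64-electrode cap. Removes signals common to all electrodes so that
    localized activity (e.g. over the motor cortex) is easier to see.
    """
    return eeg - eeg.mean(axis=0, keepdims=True)

# Example: 64 channels sharing one cap-wide artifact, plus a local
# signal on a single (arbitrarily chosen) channel 7
rng = np.random.default_rng(1)
n_samples = 1000
shared = np.sin(np.linspace(0, 20, n_samples))            # cap-wide artifact
eeg = np.tile(shared, (64, 1)) + 0.01 * rng.standard_normal((64, n_samples))
eeg[7] += np.cos(np.linspace(0, 50, n_samples))           # local signal

clean = common_average_reference(eeg)
# After re-referencing, the shared artifact is gone from every channel,
# while channel 7 keeps (almost all of) its local signal.
```

In practice this would be one of several stages, alongside band-pass filtering and artifact rejection, before any machine learning runs on the data.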

Prior to maneuvering the robotic arm, test participants were first trained to move virtual objects on a screen with their minds. They then graduated to moving objects from fixed locations, and finally, to moving objects from random locations on a table to a three-tiered shelf with the robotic arm, using their thoughts. The subjects showed remarkable accuracy during the final stage of the tests, demonstrating a success rate of over 70 percent.

Mind-controlled Prosthetics

As one might imagine, the findings could bring hope and more independence to people with neuromuscular disorders, or who have suffered strokes, paralysis and spinal cord injuries. Despite losing the ability to control their muscles, these individuals’ brains are still capable of producing the neural signals required to control a robotic prosthetic.

“Three years ago, we weren’t sure moving a more complex robotic arm to grasp and move objects using this brain-computer interface technology could even be achieved,” said He. “We’re happily surprised that it worked with a high success rate and in a group of people.”

The researchers are now turning their attention to refining this interface for mind-controlled robotic prosthetics that would be attached to the body. By removing the need for surgery, with its risks of infection and complications, such non-invasive brain-computer interfaces may be the path toward easily manipulating not only prosthetic limbs but also assistive exoskeletons that would give more mobility to a rapidly aging population.

Images: University of Minnesota
