
How to Control a Robotic Arm with Your Mind, by Using Machine Learning

Jan 22nd, 2017 6:00am
Images via University of Minnesota.

If you’ve lost the use of your arms, the idea of being able to control a robotic replacement arm with your mind might seem like an awesome idea at first. That is, until you’re told that you would probably need to have some serious surgery to crack open your skull and insert an implant into your brain to actually let you do that.

Fortunately, scientists at the University of Minnesota have developed an alternative: a technique that would allow people to move a robotic appendage around with only their thoughts, without the need for surgery or brain implants. It’s a major step in the development of non-invasive brain-computer interfaces (BCIs), which build a direct communication link between the brain and an external device. Though previous experiments showed that brain-computer interfaces could allow people to control virtual objects like moving a cursor on a screen or a helicopter in a flight simulator, and even real objects like small quadcopters, this study takes it to the next level with real-world implications.

“This is the first time in the world that people can operate a robotic arm to reach and grasp objects in a complex 3D environment using only their thoughts without a brain implant,” said Bin He, a biomedical engineering professor at the University of Minnesota and lead author on the study, which was published earlier this month in Scientific Reports. “Just by imagining moving their arms, they were able to move the robotic arm.”

Watch the hands-free magic happen:

Translating Thoughts with Machine Learning

The team’s brain-computer interface was built using electroencephalography (EEG), a monitoring method that tracks the electrical activity of the brain. In particular, the researchers focused on the motor cortex, the part of the brain that oversees movement. When a person moves, or merely thinks about executing a movement, the neurons in this area generate an electrical current that can be measured with EEG tools. Using advanced signal processing and machine learning, these “thoughts” are then translated in real time into commands that drive the robotic arm.
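To make that pipeline a little more concrete, here is a minimal illustrative sketch in Python of how band-power features from motor-cortex EEG channels might be fed to a classifier whose output is mapped to arm commands. The sampling rate, frequency band, classifier, and command set below are assumptions for illustration only, not details of the study’s actual system.

```python
# Hypothetical EEG-to-command sketch: band-power features -> classifier -> arm command.
# All parameters (sampling rate, mu/beta band, four commands) are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.linear_model import LogisticRegression

FS = 250            # sampling rate in Hz (assumed)
MU_BETA = (8, 30)   # sensorimotor rhythm band in Hz (assumed)

def band_power(epoch, fs=FS, band=MU_BETA):
    """Average mu/beta-band power for each channel of one EEG epoch (channels x samples)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epoch, axis=-1)
    freqs, psd = welch(filtered, fs=fs, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)          # one feature per channel

# Toy training data standing in for labeled "imagined movement" epochs:
# shape (n_trials, n_channels, n_samples), with one label per trial.
rng = np.random.default_rng(0)
X_epochs = rng.standard_normal((200, 64, FS * 2))
y = rng.integers(0, 4, size=200)                 # 0=left, 1=right, 2=up, 3=down

features = np.array([band_power(e) for e in X_epochs])
clf = LogisticRegression(max_iter=1000).fit(features, y)

COMMANDS = {0: "MOVE_LEFT", 1: "MOVE_RIGHT", 2: "MOVE_UP", 3: "MOVE_DOWN"}

def decode(epoch):
    """Translate a new two-second EEG epoch into a robotic-arm command."""
    return COMMANDS[int(clf.predict(band_power(epoch)[None, :])[0])]

print(decode(rng.standard_normal((64, FS * 2))))
```

In a real system the classifier would run continuously on a sliding window of EEG data, so the arm tracks the user’s imagined movement rather than responding to isolated epochs.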

In this experiment, the team used a state-of-the-art EEG cap with 64 electrodes, which measures the electrical activity generated in the test subject’s motor cortex when they think about movement. Because the brain’s neurons are constantly firing, thoughts about different movements can overlap, producing neural “noise” that has to be filtered out. These overlapping layers of brain activity are mapped using functional magnetic resonance imaging (fMRI) and sorted and analyzed with advanced signal processing techniques, an approach He and his team explored in previous experiments that laid the foundation for this study.
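One simple example of that kind of filtering (not taken from the study, just a common first step with multi-electrode caps) is re-referencing every channel to the average of all 64 channels, so activity shared across the whole scalp is suppressed and localized motor-cortex signals stand out:

```python
# Illustrative common-average-reference step for a 64-channel EEG recording.
import numpy as np

def common_average_reference(eeg):
    """eeg: array of shape (n_channels, n_samples); subtract the across-channel mean at each sample."""
    return eeg - eeg.mean(axis=0, keepdims=True)

raw = np.random.default_rng(1).standard_normal((64, 500))   # stand-in for one 64-channel recording
clean = common_average_reference(raw)
```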

Prior to maneuvering the robotic arm, test participants were first trained to move virtual objects on a screen with their minds. They then graduated to moving objects from fixed locations, and finally, to moving objects from random locations on a table to a three-tiered shelf with the robotic arm, using their thoughts. The subjects showed remarkable accuracy during the final stage of the tests, demonstrating a success rate of over 70 percent.

Mind-controlled Prosthetics

As one might imagine, the findings could bring hope and greater independence to people with neuromuscular disorders, or those who have suffered strokes, paralysis or spinal cord injuries. Despite losing the ability to control their muscles, these individuals’ brains are still capable of producing the neural signals required to control a robotic prosthetic.

“Three years ago, we weren’t sure moving a more complex robotic arm to grasp and move objects using this brain-computer interface technology could even be achieved,” said He. “We’re happily surprised that it worked with a high success rate and in a group of people.”

The researchers are now turning their attention to refining this interface for mind-controlled robotic prosthetics attached directly to the body. By removing the need for surgery, and with it the risk of infection and other complications, such non-invasive brain-computer interfaces may be the path toward easily manipulating not only prosthetic limbs but also assistive exoskeletons that could give more mobility to a rapidly aging population.
