Gain Two Extra VR-Controlled Robotic Arms with this Backpack

Aug 30th, 2018 11:00am

We take the convenience of having two arms and two hands mostly for granted, but recent developments in robotics are showing that even the able-bodied could do a lot with an extra programmable arm (or even a finger, for that matter).

Looking to augment the human body beyond its given limitations, Yamen Saraiji, an assistant professor at Keio University Graduate School of Media Design, has created Fusion, a set of robotic arms that can be controlled by another person through virtual reality. Watch to see how these extra appendages work:

Developed in collaboration with the University of Tokyo, the prototype is a wearable, 21-pound system carried like a backpack, with mechanical arms that end in a machine facsimile of human hands and fingers. In addition to a battery with enough power for about an hour and a half of operation, the pack houses a small computer that wirelessly relays data between the user and the remote operator, as well as a camera.

It’s through the camera that a second person, acting as the remote operator in another location, can “see” whatever the first user sees in real time via an Oculus Rift VR headset. Thanks to sensors in the headset, when the remote operator moves their head in virtual reality, the camera moves accordingly, giving the operator a ringside seat in the middle of the action, so to speak. To remotely control the backpack’s robotic arms, the operator uses Oculus’ Touch controllers, which relay the movements to the microcontroller connected to the backpack’s computer.
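To make that relay concrete, here is a minimal sketch of the pose-forwarding loop implied by the description above. Everything in it is an assumption for illustration: the function names, the Pose type, and the 60 Hz rate are invented stand-ins, since the Fusion software itself is not public.

```python
import time
from dataclasses import dataclass


@dataclass
class Pose:
    """A 6-DoF pose: position in meters, orientation in degrees."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float


# Hypothetical stubs: Fusion is not open source, so these functions only
# model the data flow the article describes; none of the names come from
# the actual project.

def read_headset_orientation() -> tuple[float, float, float]:
    return (0.0, 0.0, 0.0)  # stub: operator's head yaw/pitch/roll


def read_touch_controller_poses() -> tuple[Pose, Pose]:
    p = Pose(0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
    return (p, p)  # stub: left/right Touch controller poses


def set_camera_gimbal(yaw: float, pitch: float, roll: float) -> None:
    pass  # stub: wireless command to the backpack's camera mount


def set_arm_targets(left: Pose, right: Pose) -> None:
    pass  # stub: end-effector targets; IK runs on the backpack's computer


def stream_camera_to_headset() -> None:
    pass  # stub: video from the backpack camera back to the Rift


UPDATE_HZ = 60  # assumed control rate; the article gives no figure


def relay_loop(cycles: int) -> None:
    """Forward the remote operator's movements to the wearable robot."""
    for _ in range(cycles):
        # Headset orientation drives the backpack camera, letting the
        # operator look around from the wearer's shoulders.
        set_camera_gimbal(*read_headset_orientation())

        # Touch controller poses become targets for the two robotic arms.
        left, right = read_touch_controller_poses()
        set_arm_targets(left, right)

        # The wearer's-eye view streams back, closing the telepresence loop.
        stream_camera_to_headset()

        time.sleep(1.0 / UPDATE_HZ)


if __name__ == "__main__":
    relay_loop(cycles=3)
```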

The aim is to enable what Saraiji and his team call “full body sharing,” where the remote operator can “dive into someone else’s body” using a virtual connection. Since these arms can move independently of the person wearing them, the robotic limbs become an extension not only of the “surrogate body” they are mounted on, but also of the faraway person who controls them.

Alternatively, the mechanical hands can be removed and replaced with straps that can be attached to the host’s wrists, so that the remote operator can actually move the host person’s arms for them.

That may sound a bit creepy to some of us, but there are many potential uses for such a system, such as completing tasks collaboratively or having one person teach the other new skills in a hands-on way. The main idea is to allow a more fluid exchange and smoother interactions between two entities, explained Saraiji: “Effective communication is a key factor in social and professional contexts which involve sharing the skills and actions of more than one person. […] We demonstrate through this system the possibilities of truly embodying and transferring our body actions from one person to another, realizing true body communication.”

The project is an interesting twist on telepresence technologies, which let people “see” through a robot’s eyes without actually having to be there in person. But there’s also an element of human augmentation here, and as Saraiji outlines, the system supports three levels of “bodily driven communication”: directed, enforced and induced. The mildest flavor is “directed” communication, where the free-standing humanoid hands assist the surrogate host or teach them a new skill. “Enforced” communication happens when the robotic arms are strapped directly onto the host’s arms, permitting the remote operator to exert more physical control over the movement of the surrogate’s limbs. Finally, “induced” communication occurs when the remote operator uses more force to control the host’s arms, possibly jerking them around.
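One way to picture how these three levels might differ in software is as a mode setting that bounds how much force the remote operator can exert on the wearer. This is purely an illustrative sketch: the mode names come from the article, but the force values and the clamping function are invented assumptions.

```python
from enum import Enum, auto


class BodyComm(Enum):
    """Saraiji's three levels of 'bodily driven communication'."""
    DIRECTED = auto()  # free-standing arms assist or demonstrate a skill
    ENFORCED = auto()  # arms strapped to the wearer's wrists, guiding them
    INDUCED = auto()   # stronger force applied through the straps


# Illustrative force ceilings in newtons. The article gives no numbers;
# these values are invented solely to show the ordering of the modes.
MAX_FORCE_N = {
    BodyComm.DIRECTED: 0.0,   # no force on the wearer's own limbs
    BodyComm.ENFORCED: 10.0,  # gentle guidance through the wrist straps
    BodyComm.INDUCED: 30.0,   # firm control of the host's arms
}


def clamp_operator_force(mode: BodyComm, requested_n: float) -> float:
    """Cap the force the remote operator may exert in the current mode."""
    return min(requested_n, MAX_FORCE_N[mode])


print(clamp_operator_force(BodyComm.ENFORCED, 25.0))  # -> 10.0
```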

It’s not the first time Saraiji has experimented with human augmentation; in fact, Fusion represents a significant upgrade over his previous project, MetaLimbs, which featured a robotic “third arm” controlled by foot pedals. But because Fusion can be controlled remotely by another user, it opens up many more possibilities: one can imagine a distant expert using the arms to help the wearer tackle a difficult task, like fixing a computer, aiding in a medical emergency, or guiding a session of remotely assisted physical therapy. The team now hopes to commercialize Fusion; to find out more, visit Yamen Saraiji’s site.

Images: Keio University & University of Tokyo.
