
Feeding People with an Intelligent Robot Arm Is Harder Than You Think

Mar 22nd, 2019 6:00am

As the population ages, ensuring that aging and differently-abled people retain as much autonomy as possible will pose many challenges. In particular, it’s estimated that over 1 million adults in the United States alone already need some kind of assistance to feed themselves. But the solution isn’t as simple as hiring a caregiver to come in on a daily basis to help with feedings: first, that can be quite costly; secondly, for some, it can be a discouraging daily reminder of how much independence has been lost.

Now here’s an interesting question: would that dismal perception change if a robot helped out with feeding instead? While the use of robotic prosthetics is not new, the task of bringing a piece of food to a waiting mouth is actually more complicated than it seems, especially for a machine. After all, there are a lot of variables: food comes in all kinds of different shapes, sizes and textures, and it needs to be manipulated at the right angle so that it can be delivered into a human mouth.

Aiming to develop a tool that addresses these issues, researchers at the University of Washington have come up with the Assistive Dexterous Arm (ADA), a robotic assistive feeding system that can be attached to a wheelchair. The system comes with a special sensor-equipped fork, and uses a variety of algorithms and components that allow it to deconstruct the complex tasks involved in delivering food from fork to mouth. A video from the team demonstrates how it works.

‘Universe of Food Types’

To develop the system, the researchers focused on breaking down the feeding task into what they called “bite acquisition” (getting the food onto the fork) and “bite transfer” (getting the food from the fork into someone’s mouth). Skewering food correctly requires considering the properties of the food. For example, a long, slender, crisp carrot is better speared directly from above, and closer to the end, so that people can take bites away from the fork tines, while softer bananas need to be impaled at an angle to ensure that they won’t slip off the fork.
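The idea of choosing a skewering strategy from coarse food properties can be sketched in a few lines of Python. This is an illustrative toy, not the researchers' actual policy: the food categories, angles, and offsets below are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class SkewerPlan:
    angle_deg: float  # fork pitch relative to vertical (0 = straight down)
    offset: str       # where on the item to spear

def plan_skewer(food: str) -> SkewerPlan:
    """Pick a skewering strategy from coarse food properties.

    The categories and numbers here are illustrative placeholders,
    not the values learned by the UW system.
    """
    firm_slender = {"carrot", "celery"}
    soft = {"banana", "cantaloupe"}
    if food in firm_slender:
        # Spear straight down, near the end, so bites stay clear of the tines.
        return SkewerPlan(angle_deg=0.0, offset="near_end")
    if food in soft:
        # Tilt the fork so soft pieces are less likely to slip off.
        return SkewerPlan(angle_deg=45.0, offset="center")
    # Default strategy for foods we have no special rule for.
    return SkewerPlan(angle_deg=0.0, offset="center")
```

A real system would learn these mappings from data rather than hard-code them, which is exactly what the training procedure described below is for.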

“If we don’t take into account how easy it is for a person to take a bite, then people might not be able to use our system,” said computer science and engineering professor and paper co-author Siddhartha Srinivasa. “There’s a universe of types of food out there, so our biggest challenge is to develop strategies that can deal with all of them.”

To generate training data for the system, the researchers first enlisted volunteers who were asked to feed a mannequin common foods like bananas, cantaloupe, carrots, celery, hard-boiled eggs, and strawberries — using a fork with a sensor that could measure how much force they used to pick up the food, and at what angle they manipulated odd-shaped pieces into the mouth.

The robotic arm uses a combination of visual and haptic sensing to pick up and transfer food, relying on two algorithms to perform the feeding process. First, an object-detection algorithm called RetinaNet scans the plate and identifies the types of food, marking each item with a bounding box in the robot’s vision system. The system then uses the SPNet (Skewering Position Network) algorithm to estimate the skewering locations and rotation angles that would lead to the most reliable bite acquisition and easiest bite transfer.
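The two-stage pipeline — detect each food item, then choose a skewering position and rotation within its bounding box — might be wired together as below. This is a hypothetical sketch: the function names, stubbed detections, and centre-of-box heuristic are placeholders standing in for the trained RetinaNet and SPNet models, not the authors' actual API.

```python
from typing import NamedTuple, List

class Detection(NamedTuple):
    label: str
    box: tuple  # (x0, y0, x1, y1) in image pixels

class SkewerAction(NamedTuple):
    x: float
    y: float
    roll_deg: float  # fork rotation about its long axis

def detect_food(image) -> List[Detection]:
    """Stand-in for a RetinaNet-style detector (hypothetical).

    A real system would run a trained network on the plate image;
    here we return fixed detections purely for illustration.
    """
    return [
        Detection("carrot", (10, 20, 60, 40)),
        Detection("banana", (80, 30, 150, 70)),
    ]

def skewer_plan(det: Detection) -> SkewerAction:
    """Stand-in for SPNet (hypothetical): map a detection to a
    skewering position and fork rotation. This toy version targets
    the box centre and picks a food-dependent rotation; the real
    network predicts both from image data."""
    x0, y0, x1, y1 = det.box
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
    roll = 0.0 if det.label == "carrot" else 90.0
    return SkewerAction(cx, cy, roll)

plate = None  # placeholder for a camera image
actions = [skewer_plan(d) for d in detect_food(plate)]
```

The separation matters: the detector only has to answer “what food is where,” while the skewering network specializes in “how to pick up this particular item,” which is the part that varies most across food types.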

Using this customized set-up, the team found that the combination of these food-tailored techniques resulted in the robotic arm performing the same or better than a system that used a homogeneous procedure for different kinds of food. The researchers are now working with other organizations to test out the system by gathering feedback from caregivers and patients in assisted living facilities.

“Ultimately our goal is for our robot to help people have their lunch or dinner on their own,” noted Srinivasa. “But the point is not to replace caregivers: We want to empower them. With a robot to help, the caregiver can set up the plate, and then do something else while the person eats.”

Images: University of Washington
