March 4, 2025
Video: The UW’s assistive-feeding robot gets tested outside the lab
The mechanics of eating are more complex than they might appear. For about a decade, researchers in the Personal Robotics Lab at the University of Washington have been working to build a robot that can help feed people who can’t eat on their own.
Researchers’ first breakthrough, back when the lab was at Carnegie Mellon University, was getting a robotic arm to use a fork to feed someone a marshmallow. Since then, the robot has graduated from feeding users fruit salads to full meals composed of nearly anything that can be picked up with a fork. Researchers have also investigated how the robot can enhance the social aspects of dining.
Until recently, this work had mostly been evaluated in the lab. But last year, researchers deployed the assistive feeding arm in a pair of studies outside the lab. In the first, six users with motor impairments used the robot to feed themselves a meal in a UW cafeteria, an office or a conference room. In the second study, one of those users, Jonathan Ko, a community researcher and a co-author of the paper, used the system at home for five days, having it feed him ten meals.
The team will present its research March 5 at the ACM/IEEE International Conference on Human-Robot Interaction in Melbourne.
“Our past studies have been in the lab because, if you want to evaluate specific system components in isolation, you need to control all other aspects of the meal,” said lead author Amal Nanavati, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. “But that doesn’t capture the diverse meal contexts that exist outside the lab. At the end of the day, the goal is to enable people to feed themselves in real environments, so we should also evaluate the system in those environments.”
The system, which researchers dubbed ADA (Assistive Dexterous Arm), consists of a robotic arm that can be affixed to something nearby, such as a power wheelchair or hospital table. The user specifies what bite they want through a web app, and the system then feeds the user that bite autonomously (though users can stop the arm with a “kill button”). The arm has a force sensor and camera to distinguish between foods and to get the food to the user’s mouth.
In both studies, users successfully fed themselves their meals. In the first study, the robot acquired entrees with around 80% accuracy, a level that users in a previous study had identified as the threshold for success. In the second study, the home’s varied circumstances and environments — Ko could be eating while watching TV in low light or while working in bed — hindered the system’s default functionality. But researchers designed the system to be customizable, so Ko was able to adjust the robot’s behavior and still feed himself all his meals.
The team plans to continue improving the system for effectiveness and customizability.
“It was a really important step to take the robot out of the lab,” Ko said. “You eat in different environments, and there are little variables that you don’t think about. If the robot is too heavy, it might tilt a table. Or if the lighting isn’t good, the facial recognition could struggle, but lighting is something you really don’t think about when you’re eating.”
Additional co-authors include Ethan K. Gordon, Bernie Hao Zhu, and Rosario Scalise, all doctoral students in the Allen School; Taylor A. Kessler Faulkner, a lecturer in the Allen School; Tyler Schrenk, the late president of the Tyler Schrenk Foundation and a community researcher; Vy Nguyen, an occupational therapy clinical research lead at Hello Robot; Haya Bolotski, a high school intern in the Allen School while completing this research; Yuxin (Ray) Song, Atharva Kashyap, Sriram Kutty, Liander Rainbolt, and Ramon Qu, all undergraduate students in the Allen School while doing this research; Raida Karim, a research scientist assistant in the Allen School while completing this research; and Hanjun Song, a research scientist in the Allen School while completing this research. Maya Cakmak and Siddhartha S. Srinivasa, both professors in the Allen School, are senior authors. This research was funded in part by a UW CREATE student grant; UW Allen School Postdoc Research award; the National Science Foundation; the DARPA RACER program; the National Institute of Biomedical Imaging and Bioengineering; and the Office of Naval Research.
For more information, contact Nanavati at amaln@cs.washington.edu.
Tag(s): Amal Nanavati • College of Engineering • Paul G. Allen School of Computer Science & Engineering