April 5, 2012
Children perceive humanoid robot as emotional, moral being
Robot nannies could diminish child care worries for parents of young children. Equipped with alarms and monitoring capabilities to guard children from harm, a robot nanny would let parents leave youngsters at home without a babysitter.
Sign us up, parents might say.
Human-like robot babysitters are in the works, but it’s unclear at this early stage what children’s relationships with these humanoids will be like and what dangers lurk in this convenient-sounding technology.
Will the robots do more than keep children safe and entertained? Will they be capable of fostering social interactions, emotional attachment, intellectual growth and other cognitive aspects of human existence? Will children treat these caregivers as personified entities, or like servants or tools that can be bought and sold, misused or ignored?
“We need to talk about how to best design social robots for children, because corporations will go with the design that makes the most money, not necessarily what’s best for children,” said Peter Kahn, associate professor of psychology at the University of Washington. “In developing robot nannies, we should be concerned with how we might be dumbing down relationships and stunting the emotional and intellectual growth of children.”
To guide robot design, Kahn and his research team are exploring how children interact socially with a humanoid robot. In a new study, the researchers report that children exchanged social pleasantries, such as shaking hands, hugging and making small talk, with a remotely controlled human-like robot (Robovie) that appeared autonomous. Of the 90 children – an even mix of boys and girls, aged 9, 12 or 15 – nearly 80 percent believed that the robot was intelligent, and 60 percent believed it had feelings.
The journal Developmental Psychology published the findings in its March issue.
The children also played a game of “I Spy” with Robovie, allowing the researchers to test what morality children attribute to the robot. The game started with the children guessing an object in the room chosen by Robovie, who then got a turn to guess an object chosen by the child.
But the humanoid robot’s turn was cut short when a researcher interrupted to say it was time for the interview part of the experiment and told Robovie that it had to go into a storage closet. Via a hidden experimenter’s commands, Robovie protested, saying that it wasn’t fair to end the game early. “I wasn’t given enough chances to guess the object,” the robot argued, going on to say that its feelings were hurt and that the closet was dark and scary.
When interviewed by the researchers, 88 percent of the children thought the robot was treated unfairly in not having a chance to take its turn, and 54 percent thought that it was not right to put it in the closet. A little more than half said that they would go to Robovie for emotional support or to share secrets.
But the children were less willing to grant Robovie civil liberties, such as being paid for its work. They also said that the robot could be bought and sold, and that it should not have the right to vote.
The findings show that social interactions with Robovie led children to develop feelings for the robot and to attribute some moral standing to it. This suggests that the kinds of interactions used in the study capture aspects of human social experience that could inform the design of future robots.
The researchers added that robot nanny design should also factor in how agreeable a robot should be with a child. Should a robot be programmed to give in to all the child’s desires, play whatever game is demanded? Or should it push back, like Robovie did when the I Spy game ended early?
Kahn believes that as social robots become pervasive in our everyday lives, they can benefit children but also potentially impoverish their emotional and social development.
The National Science Foundation funded the study. Co-authors at UW are Nathan Freier, Rachel Severson, Jolina Ruckert and Solace Shen. Other co-authors are Brian Gill of Seattle Pacific University, and Hiroshi Ishiguro and Takayuki Kanda, both of Advanced Telecommunications Research Institute, which created Robovie.
###
For more information, contact Kahn at pkahn@uw.edu; or Shen at 206-221-0643 or solaces@uw.edu.