
Photo caption: The UW team used I-LABS research on how babies follow an adult’s gaze to “teach” a robot to perform the same task. Credit: University of Washington.
A collaboration between I-LABS and University of Washington computer scientists has demonstrated that robots can “learn” much like kids do: by amassing data through exploration, watching a human perform a task, and determining how to carry out that task on their own.
“You can look at this as a first step in building robots that can learn from humans in the same way that infants learn from humans,” said senior author Rajesh Rao, a UW professor of computer science and engineering.
“If you want people who don’t know anything about computer programming to be able to teach a robot, the way to do it is through demonstration — showing the robot how to clean your dishes, fold your clothes, or do household chores. But to achieve that goal, you need the robot to be able to understand those actions and perform them on its own.”
The research was published in November in the journal PLOS ONE.
“Babies engage in what looks like mindless play, but this enables future learning. It’s a baby’s secret sauce for innovation,” said co-author Andrew Meltzoff, co-director of I-LABS. “If they’re trying to figure out how to work a new toy, they’re actually using knowledge they gained by playing with other toys. During play they’re learning a mental model of how their actions cause changes in the world. And once you have that model you can begin to solve novel problems and start to predict someone else’s intentions.”
Read the research paper »
Read the university news release »
Selected Media Coverage
The Atlantic
Smithsonian
Popular Science
NSF’s Science360
Quartz
R&D Magazine
Product Design & Development
SEAPOWER magazine (Navy League of the United States)