MIRROR's researchers have studied how robots can learn to communicate by imitating human gestures. They first observed how infants and monkeys learn complex acts such as grasping, then transferred those findings to robots.
"Mirror neurons" are activated when monkeys or infants see a grasping action performed by someone else. These neurons act as a motor resonance system, firing both during goal-directed actions and during the observation of similar actions performed by others.
The scientists conducted experiments with monkeys and infants to see how visual and motor information can be used to learn to discriminate grasping actions. They then used that information to show how, by detecting visual cues to the function of an object, a robot can mimic simple object-directed actions.
Finally, they integrated the developed work into a humanoid robot comprising a binocular head, an arm, and a multi-fingered hand. Although the integration is not yet complete, they have uncovered many elements of a biologically compatible architecture that can be replicated in robots.
The follow-up project, called RobotCub, focuses on building a humanoid platform and studying the development of manipulation skills.
Via PhysOrg.