Learning, generating and evaluating socio-communicative behaviors of a humanoid robot for human-robot interaction
A socially assistive robot (SAR) is meant to engage people in situated interactions such as monitoring physical exercise, neuropsychological rehabilitation, or cognitive training. While the interactive behavioral policies of such systems are mostly hand-scripted, we discuss here the key features of the training of multimodal interactive behaviors by demonstration that we developed in the framework of the SOMBRERO project. Immersive teleoperation of a SAR by professional caregivers enables these human pilots to teach the robot how to conduct interactive tasks while actually performing them, i.e., by giving proper instructions, demonstrations, and feedback. We describe how to build a multimodal interactive behavioral model, and how to design gesture controllers that execute the events generated by this model to drive the speech, hand gestures, and gaze of our iCub robot. We also propose a framework for the online evaluation of the multimodal interactive behaviors of our SAR, and we show that this framework allows the robot's faulty behaviors to be detected and reduced.
