More details can be found here: http://www.gipsa-lab.grenoble-inp.fr/~frederic.elisei/RoboTrio/
The corpus involves a collaborative game played simultaneously by two humans. They sit in front of a social robot that acts as game animator and referee. The robot is teleoperated by a human pilot: the pilot's gaze, eye vergence, head orientation, lip and jaw articulation, and speech are captured in real time and drive the robot. In this immersive teleoperation setup, the pilot sees through the robot's stereo cameras and hears through the robot's ears, leading to a high level of embodiment. What the pilot demonstrates is a viable solution, using only the robot's sensors and actuators, for conducting a natural interaction with humans and successfully performing the intended task (social interaction with gaze and speech turn-taking in a gaming scenario). Data streams and events relating to both perception and action are logged. These were primarily intended for building autonomous behaviour models for a social robot (Nina, a modified iCub).
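As a rough illustration of how such timestamped perception/action logs might be consumed, here is a minimal sketch. The file layout, stream names (`pilot_gaze`, `speech_on`), and CSV format are purely hypothetical assumptions for illustration; the actual corpus formats are documented at the URL above.

```python
import csv
from collections import defaultdict

# Hypothetical layout (NOT the corpus's actual format): each experiment
# logs timestamped events as CSV rows (timestamp_s, stream, value), with
# stream names such as "pilot_gaze" or "speech_on".
def load_streams(path):
    """Group logged events by stream name, keeping (time, value) pairs."""
    streams = defaultdict(list)
    with open(path, newline="") as f:
        for ts, stream, value in csv.reader(f):
            streams[stream].append((float(ts), value))
    return streams

def events_between(streams, name, t0, t1):
    """Return one stream's events within a time window, e.g. to align
    gaze shifts with speech turn-taking."""
    return [(t, v) for t, v in streams[name] if t0 <= t <= t1]
```

Grouping by stream and slicing by time window is the basic operation needed to relate perception events (what the pilot saw or heard) to action events (gaze, speech) when fitting behaviour models.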
23 experiments were recorded (around 20 minutes each). The pilot is always the same person; the human players differ from one experiment to the next. The two players in any given experiment are either both male or both female.