50 % CRISSP - 50 % PFP
I have been a permanent CNRS research engineer since December 2004, conducting my research activities in the CRISSP team. I am responsible for the development and operation of the MICAL experimentation platform, designed to study face-to-face spoken communication between two humans and between a human and a computer, where not only speech but also eye gaze, gesture, body posture, and facial expression are involved. I am also responsible for the social robotics platform, including the Nina iCub robot, used to collect face-to-face interaction data, create interaction models, and evaluate these models in situated interaction scenarios (a Furhat robot is also used for this purpose).
In the CRISSP team, we study face-to-face communication: synthesis of audio and facial movements, analysis and synthesis of eye movements, pointing gestures, shared attention, and more. Previous application domains have included telecommunication systems, communication for hearing-impaired people (generating Cued Speech with a 3D hand), and social robotics.
To this end, we have been working on the creation, representation, and coding of animated, textured 3D face clones, also known as "Talking Heads". We have since extended our research domain to a talking robot (a modified iCub). We have also been interested in communication in shared-reality environments, where pointing at real or virtual objects eases communication between a human and a robot or a computer agent, with a special interest in human eye gaze and its phasing relative to speech segments and speech turns.