VILAIN Coriandre
Research Engineer, Université Stendhal
Research

Multimodal Speech Perception


Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.

Treille, A., Coeurdeboeuf, C., Vilain, C., & Sato, M. (2014). Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions. Neuropsychologia (in press).
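The latency and amplitude comparison described above can be illustrated in code. The sketch below is not the authors' pipeline; it assumes epoched EEG data stored in an MNE-Python file, with hypothetical condition labels "A" (auditory), "AV" (audio-visual) and "AH" (audio-haptic), and extracts the N1 peak per condition in a typical 70-150 ms window.

import mne

# Hypothetical file name and condition labels; adapt to the actual dataset.
epochs = mne.read_epochs("speech_epochs-epo.fif")

for condition in ("A", "AV", "AH"):  # auditory, audio-visual, audio-haptic
    evoked = epochs[condition].average()
    # Find the most negative deflection in the 70-150 ms window (N1 range).
    ch, latency, amplitude = evoked.get_peak(
        ch_type="eeg", tmin=0.07, tmax=0.15, mode="neg", return_amplitude=True
    )
    print(f"{condition}: N1 peak at {latency * 1e3:.0f} ms, "
          f"{amplitude * 1e6:.2f} uV on {ch}")

Shorter latencies for "AV" and "AH" than for "A" would correspond to the speeding-up effect reported in the paper.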

Speech/Gesture Interaction


Our research explores the possible encoding of distance information in vocal and manual pointing and its relationship with the linguistic structure of deictic words, as well as speech/gesture cooperation within the process of deixis. Two experiments required participants to point at and/or name a close or distant target, with speech only, with gesture only, or with speech + gesture. Acoustic, articulatory, and manual data were recorded. We investigated the interaction between vocal and manual pointing with respect to the distance to the target. There are two major findings. First, distance significantly affects both articulatory and manual pointing: participants perform larger vocal and manual gestures to designate a more distant target. Second, modality influences both deictic speech and gesture: pointing is more emphatic in the unimodal use of either modality than in their bimodal combination, as if to compensate for the absence of the other modality. These findings suggest that distance is encoded in both vocal and manual pointing. We also demonstrate that the correlates of distance encoding in the vocal modality can be related to the typology of deictic words. Finally, our data suggest a two-way interaction between speech and gesture, and support the hypothesis that these two modalities cooperate within a single communication system.

Gonseth, C., Vilain, A., & Vilain, C. (2013). An experimental study of speech/gesture interactions and distance encoding. Speech Communication, 55(4), 553-571.
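As an illustration of the two-factor comparison reported above (distance by modality), here is a minimal sketch; it is not the authors' analysis, and it assumes a hypothetical trial-level CSV with columns "distance" (close/far), "modality" (e.g. gesture-only, speech+gesture) and "amplitude" (pointing amplitude).

import pandas as pd
from scipy import stats

# Hypothetical file: one row per trial, columns "distance", "modality",
# "amplitude" (pointing amplitude in a consistent unit).
df = pd.read_csv("pointing_data.csv")

# Mean pointing amplitude per condition: larger values are expected for
# far targets and for unimodal (gesture-only) trials.
print(df.groupby(["distance", "modality"])["amplitude"].mean())

# Independent-samples comparison of close vs. far targets per modality.
for modality, sub in df.groupby("modality"):
    close = sub.loc[sub["distance"] == "close", "amplitude"]
    far = sub.loc[sub["distance"] == "far", "amplitude"]
    t, p = stats.ttest_ind(close, far)
    print(f"{modality}: t = {t:.2f}, p = {p:.3f}")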

Grenoble Images Parole Signal Automatique laboratoire

UMR 5216 CNRS - Grenoble INP - Université Joseph Fourier - Université Stendhal