Speech production depends substantially on both auditory and somatosensory information. The two occur in combination from an infant's earliest vocalizations, and both are involved in speech motor learning. It is therefore important to understand how the somatosensory system, as well as the auditory system, functions in speech processing. My research has focused on two main themes: (1) behavioral and neural correlates of somatosensory function in speech perception, and (2) somatosensory-based speech motor learning, probed by studying second language acquisition. A novel orofacial skin stretch perturbation paradigm has been used in a series of studies to probe the orofacial somatosensory system while excluding involvement of the motor system. Through this work I hope to contribute to an improved understanding of speech perceptual processing and motor learning, particularly as they affect language acquisition.
Speech motor control stabilises the articulatory system for efficient speech production, which supports smooth communication in daily life. While auditory and somatosensory feedback both contribute to speech motor control, it is still unknown how these sensory inputs interact to ensure the required stability. Our hypotheses are (1) that somatosensory feedback plays a predominant role in rapid online compensation because its response latency is shorter than that of auditory feedback, and (2) that this somatosensory-based stability mechanism is acquired during speech development in order to achieve speech-relevant auditory goals. Our project will provide clues to the mechanisms by which sensory feedback enables stable speech production.
Grenoble Images Parole Signal Automatique laboratory (GIPSA-lab)
UMR 5216 CNRS - Grenoble INP - Université Joseph Fourier - Université Stendhal