Perception of audiovisual speech synchrony for native and non-native language

Publication Type: Journal Article
Year of Publication: 2010
Authors: Navarra J, Alsius A, Velasco I, Soto-Faraco S, Spence C
Journal: Brain Research
Date Published: 04/2010
Keywords: audiovisual integration, native language, non-native language, prior experience, speech perception, synchrony perception

To what extent does our prior experience with the correspondence between audiovisual stimuli influence how we subsequently bind them? We addressed this question by testing English and Spanish speakers (having little prior experience of Spanish and English, respectively) on a crossmodal simultaneity judgment (SJ) task with spoken English or Spanish sentences. The results revealed that, for simultaneity to be perceived, the visual speech stream had to lead the auditory speech stream by a significantly larger interval in the participants' native language than in the non-native language. Critically, this difference in temporal processing between perceiving the native vs. the non-native language tended to disappear as the amount of experience with the non-native language increased. We propose that this modulation of multisensory temporal processing as a function of prior experience is a consequence of the constraining role that visual information plays in the temporal alignment of audiovisual speech signals.