📚 Volume 31, Issue 9 📋 ID: cp72EHT

Authors

Philippe Savchenko, John Yamamoto

Faculty of Medicine, Mahasarakham University, Thailand

Abstract

Learning the correspondences between the letters and speech sounds of a language is a crucial step in reading acquisition. The perception of speech and the inherently linked lip movements (audiovisual speech) emerged simultaneously during evolution, shaping the brain for the integration of this audiovisual information. We therefore conducted EEG experiments to characterize the influence of visual orthography on the most robust auditory event-related potentials (ERPs), focusing the analysis on the systematic variation of the auditory ERP as a function of visual orthographic information. The subjects received auditory (A), visual (V), and audiovisual (AV) letters and were required to identify them regardless of stimulus modality. Audiovisual letters included matching letters, in which the auditory and visual stimuli corresponded to each other based on previous experience, and nonmatching (randomly paired) letters. Meaningless auditory, visual, and audiovisual control stimuli were also presented. Both non-phonetic and phonetic audiovisual interactions were found in the ERPs to the AV stimuli. Differences between the sum of the ERPs to the unimodal A and V stimuli and the ERPs to the AV stimuli indicated interactions presumably based on the temporal coincidence of the A and V components of the AV stimuli. Differences between the ERPs to the meaningful and meaningless AV stimuli probably reflect multisensory interactions in phonetic processing.
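For readers who want to see the additive-model comparison described above in concrete form, here is a minimal sketch. It assumes averaged ERPs are available as NumPy arrays (one per condition); the array shapes and data are simulated and hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical averaged ERPs, one array per condition with shape
# (n_channels, n_timepoints); simulated here, since the paper's
# data are not available.
rng = np.random.default_rng(0)
n_channels, n_timepoints = 32, 500
erp_a = rng.normal(size=(n_channels, n_timepoints))    # auditory-only
erp_v = rng.normal(size=(n_channels, n_timepoints))    # visual-only
erp_av = rng.normal(size=(n_channels, n_timepoints))   # audiovisual

# Additive-model comparison: if audition and vision were processed
# independently, ERP(AV) would equal ERP(A) + ERP(V). The residual
# difference wave is taken as evidence of audiovisual interaction.
interaction = erp_av - (erp_a + erp_v)

# Summarize the interaction effect, e.g. its peak absolute
# amplitude per channel across the epoch.
peak_per_channel = np.max(np.abs(interaction), axis=1)
print(peak_per_channel.shape)  # (32,)
```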

📝 How to Cite

Philippe Savchenko, John Yamamoto (2024). "Audiovisual Integration of Unfamiliar Words Reflects Multisensory Interaction in Phonetic Perception". Wulfenia, 31(9).