Volume 31, Issue 2 (ID: wZBAD9n)

Authors

John Ferrari, Aisha Davis

Faculty of Medicine, Mahasarakham University, Thailand

Abstract

This study conducted EEG experiments to characterize the influence of visual orthography on the most robust auditory event-related potentials (ERPs), focusing the analysis on systematic variation of the auditory ERP as a function of visual orthographic information. To study the human brain’s audiovisual integration mechanisms for letters, subjects were presented with auditory, visual, and audiovisual Chinese characters and were required to identify them regardless of stimulus modality. Audiovisual letters included matching letters, in which the auditory and visual stimuli corresponded to each other based on previous experience, and nonmatching (randomly paired) letters. Meaningless auditory, visual, and audiovisual control stimuli were presented as well. Fourteen adult native speakers of Mandarin Chinese participated in the ERP experiment. The results demonstrate that the audiovisual interaction is an indicator for investigating the automatic processing of suprasegmental information in a tonal language. The findings support the view that both sensory-specific and heteromodal cortices are involved in the audiovisual (AV) integration of speech, engaging in the AV integration process at separate latencies and responding to different features of AV speech stimuli.
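
The audiovisual interaction mentioned in the abstract is commonly quantified by comparing the response to combined audiovisual stimuli with the sum of the unimodal responses (AV vs. A + V). As an illustration only, and not the authors' own analysis pipeline, the sketch below computes such an additive-model difference wave with MNE-Python; the epochs file name and condition labels are hypothetical placeholders.

    import mne

    # Hypothetical epochs file and condition labels, for illustration only
    epochs = mne.read_epochs("sub01_letters-epo.fif")

    ev_av = epochs["audiovisual_match"].average()  # matching audiovisual letters
    ev_a = epochs["auditory"].average()            # auditory-only letters
    ev_v = epochs["visual"].average()              # visual-only letters

    # Additive-model audiovisual interaction: AV - (A + V)
    interaction = mne.combine_evoked([ev_av, ev_a, ev_v], weights=[1, -1, -1])
    interaction.plot_joint()  # inspect where and when the interaction deviates from zero

A reliably nonzero difference wave under this additive model is the usual operational sign of multisensory interaction in ERP studies.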

πŸ“ How to Cite

John Ferrari, Aisha Davis (2024). "Tonal Brain Speaker Processing for Audiovisual Integration of Lexical Word Perception". Wulfenia, 31(2).