Data from: The visual speech head start improves perception and reduces superior temporal cortex responses to auditory speech
Data files (Aug 15, 2019 version, 66.12 MB total)

- analyze_ieeg_congruency.R (13.57 KB)
- Batch_3266022_batch_results.csv (984.12 KB)
- Batch_3270163_batch_results.csv (1.04 MB)
- behavior_exp1.R (2.37 KB)
- behavior_exp2.R (2.40 KB)
- behavior_functions.R (6.53 KB)
- behavior_power_analysis.R (1.52 KB)
- congruency_data_set_by_electrode.csv (64.06 MB)
- ieeg_analysis_functions.R (5.97 KB)
- ieeg_summary_stats.csv (6.41 KB)
Abstract
Visual information about speech content from the talker's mouth is often available before auditory information from the talker's voice. Previously, we demonstrated that audiovisual speech selectively enhances activity in regions of the early visual cortex representing the mouth of the talker (Ozker et al., 2018b). Here we examined perceptual and neural responses to words with and without this visual head start. For both types of words, perception was enhanced by viewing the talker's face, but the enhancement was significantly greater for words with a head start. Neural responses were measured from electrodes implanted over auditory association cortex in the posterior superior temporal gyrus (pSTG) of epilepsy patients. The presence of visual speech suppressed responses to auditory speech, more so for words with a visual head start. We suggest that the head start inhibits representations of incompatible auditory phonemes, increasing perceptual accuracy and decreasing total neural responses. Together with previous work showing visual cortex modulation (Ozker et al., 2018b), these results from pSTG demonstrate that multisensory interactions are a powerful modulator of activity throughout the speech perception network.