
Repeatedly experiencing the McGurk effect induces long-lasting changes in auditory speech perception

Cite this dataset

Magnotti, John; Beauchamp, Michael (2024). Repeatedly experiencing the McGurk effect induces long-lasting changes in auditory speech perception [Dataset]. Dryad. https://doi.org/10.5061/dryad.4f4qrfjkw

Abstract

In the McGurk effect, presentation of incongruent auditory and visual speech evokes a fusion percept different from either component modality. We show that repeatedly experiencing the McGurk effect for 14 days induces a change in auditory-only speech perception: the auditory component of the McGurk stimulus begins to evoke the fusion percept, even when presented on its own without accompanying visual speech. This perceptual change, termed fusion-induced recalibration (FIR), was talker-specific and syllable-specific and persisted for a year or more in some participants without any additional McGurk exposure. Participants who did not experience the McGurk effect did not show FIR, indicating that recalibration was driven by multisensory prediction error. A causal inference model of speech perception incorporating multisensory cue conflict accurately predicted individual differences in FIR. Just as the McGurk effect demonstrates that visual speech can alter the perception of auditory speech, FIR shows that these alterations can persist for months or years. The ability to induce seemingly permanent changes in auditory speech perception will be useful for studying plasticity in brain networks for language and may provide new strategies for improving language learning.

README: Repeatedly experiencing the McGurk effect induces long-lasting changes in auditory speech perception

These data are individual participant reports collected during the presentation of speech stimuli.

Description of the data and file structure

The data are contained in the file SummaryData.xlsx, which has two worksheets.

The "batch1" sheet contains data from the main experiment. The "mturk" sheet contains the data from the replication study.
In each sheet, one row corresponds to the responses to a particular stimulus on a particular day, aggregated across repeated presentations of that stimulus (if applicable).
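
As a minimal sketch of how the workbook might be loaded in R (assuming the readxl package; only the file and sheet names are taken from this README):

    library(readxl)

    # List the worksheet names ("batch1" and "mturk")
    excel_sheets("SummaryData.xlsx")

    # Read each worksheet into a data frame
    batch1 <- read_excel("SummaryData.xlsx", sheet = "batch1")
    mturk <- read_excel("SummaryData.xlsx", sheet = "mturk")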

In both sheets, the first column is the participant ID (one ID per participant) and the second column is the name of the stimulus file.
the "day_type" column indicates pre-test, training, post-test, or long-term post-test.
the "time" column indicates the experimental day.
"stimulusPresentation" is A for auditory-only, AV for audiovisual.
"stimulusType" refers to auditory and visual congruency (congruent or McGurk)
"n_responses" is the total number of responses for that stimulus on that data.
"resp_X" is the number of each type of response, where A is auditory, F is fusion, O is other, V is visual
"stimulusPaperName" is identical to the stimulus name except for some stimuli that are referred to in the paper with an abbreviation (e.g. S1)
"training" refers to whether a stimulus was presented on training days.

Sharing/Access information

The stimuli presented to the participants are available at https://openwetware.org/wiki/Beauchamp:Stimuli

Code/Software

The R code used to analyze the data and produce the manuscript results is provided in the compiled R Markdown file full_results.html.

Funding

National Institute of Neurological Disorders and Stroke