Data from: Humans can use positive and negative spectrotemporal correlations to detect rising and falling pitch
Data files
Oct 28, 2025 version files: 3.91 GB total
- binauralExperiment.zip (620.16 MB)
- coherenceExperiment.zip (369.18 MB)
- correlatedPipsExperiment.zip (485.44 MB)
- fmriExperiment.zip (564.54 KB)
- generalToneGeneration.zip (8.02 KB)
- glidersExperiment.zip (248.63 MB)
- pipDeltaNoteExperiment.zip (358.81 MB)
- pipDeltaTimeExperiment.zip (392.15 MB)
- pipDeltaTimeSuppExperiment.zip (449.13 MB)
- README.md (4.32 KB)
- soundAnalysisCode.zip (168.81 KB)
- ternaryExperiment.zip (983.19 MB)
Abstract
To discern speech or appreciate music, the human auditory system detects how pitch changes over time. However, the algorithms used to detect changes in pitch, or pitch motion, are incompletely understood. Here, using psychophysics, computational modeling, functional neuroimaging, and analysis of recorded speech, we ask if humans can detect pitch motion using computations analogous to those used by the visual system. We adapted stimuli from studies of vision to create novel auditory correlated noise stimuli that elicited robust pitch motion percepts. Crucially, these stimuli are inharmonic and possess no long-range features across frequency or time, but do possess positive or negative local spectrotemporal correlations in intensity. In psychophysical experiments, we discovered that humans can judge pitch direction based only on positive or negative spectrotemporal intensity correlations. The key behavioral result—robust sensitivity to the negative spectrotemporal correlations—is a direct analogue of illusory “reverse-phi” motion in vision, and thus constitutes a new auditory illusion. Our behavioral results and computational modeling led us to hypothesize that human auditory processing may employ pitch direction opponency. fMRI measurements in auditory cortex supported this hypothesis. To link our psychophysical findings to real-world pitch perception, we analyzed recordings of English and Mandarin speech and found that pitch direction was robustly signaled by both positive and negative spectrotemporal correlations, suggesting that sensitivity to both types of correlations confers ecological benefits. Overall, this work reveals how motion detection algorithms sensitive to local correlations are deployed by the central nervous system across disparate modalities (vision and audition) and dimensions (space and frequency).
Code and data for generating stimuli and conducting analyses related to human auditory correlation sensitivity.
Each archive corresponds to one experiment and the figure panels in which it appears:
ternaryExperiment.zip: contains the psychophysical script (ternary.m), tone generating function (ternaryFunction.m), and analysis code (analyzeTernaryExperiment.m) to generate Figure 1c.
coherenceExperiment.zip: contains the psychophysical script (coherence.m), tone generating function (coherenceFunction.m), and analysis code (analyzeCoherenceExperiment.m) to generate Figure 1d.
binauralExperiment.zip: contains the psychophysical script (binaural.m), tone generating function (binauralFunction.m), and analysis code (analyzeBinauralExperiment.m) to generate Figure 1f.
pipDeltaTimeExperiment.zip: contains the psychophysical script (pipDeltaTime.m), tone generating function (pipDeltaTimeFunction.m), and analysis code (analyzePipDeltaTimeExperiment.m) to generate Figure 2c.
pipDeltaTimeSuppExperiment.zip: contains the psychophysical script (pipDeltaTimeSupp.m), tone generating function (pipDeltaTimeSuppFunction.m), and analysis code (analyzePipDeltaTimeSuppExperiment.m) to generate Figure S2a, b.
pipDeltaNoteExperiment.zip: contains the psychophysical script (pipDeltaNote.m), tone generating function (pipDeltaNoteFunction.m), and analysis code (analyzePipDeltaNoteExperiment.m) to generate Figure 2d.
correlatedPipsExperiment.zip: contains the psychophysical script (correlatedPips.m), tone generating function (correlatedPipsFunction.m), and analysis code (analyzeCorrelatedPipsExperiment.m) to generate Figure 2b, c.
glidersExperiment.zip: contains the psychophysical script (gliders.m), tone generating function (glidersFunction.m), and analysis code (analyzeGlidersExperiment.m) to generate Figure S3c, d.
soundAnalysisCode.zip: contains the script analyzeCorrelationsFunctionsFigure_final.m and audio clip cummings_trimmed.m4a to generate Figures 5 and S5.
generalToneGeneration.zip: contains the tone generating function generalToneGenerationFunction.m for creating and listening to the tone types that were used in these experiments.
fmriExperiment.zip: contains the code and NIFTI files to generate Figures 4h, i, j and S4g, h, i.
Variables:
Each .mat file contains several variables that were saved during the corresponding experiment, with little variation between experiments. Most importantly, [participant ID].mat holds a struct with a given participant's responses, and [participant ID]_toneData.mat holds a struct with sufficient information to recreate each tone that the participant was presented with. Within a given participant's response struct, the variables of particular interest across experiments are the tone direction (displacement) and the correlation (corrparity).
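As a minimal sketch of how these variables might be inspected in MATLAB (the participant ID P01 and the name of the wrapping field are hypothetical; field names beyond displacement and corrparity vary by experiment):

```matlab
% Load one participant's response struct and the matching tone data.
% 'P01' is a placeholder participant ID; substitute a real file name
% from the experiment's data folder.
resp  = load('P01.mat');           % struct holding the participant's responses
tones = load('P01_toneData.mat');  % struct sufficient to recreate each tone

% The top-level field name may differ by experiment; fieldnames()
% reveals what was actually saved.
disp(fieldnames(resp));

% Across experiments, the variables of particular interest are the
% tone direction (displacement) and the correlation (corrparity).
s = resp.(char(fieldnames(resp)));  % unwrap the saved struct (assumes one field)
direction = s.displacement;         % tone direction on each trial
parity    = s.corrparity;           % positive vs. negative correlation
```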
Specifications:
All psychophysical task code was run on a MacBook Pro with an Intel chip using MATLAB R2021b and Psychtoolbox 3.0.18. All fMRI analyses were conducted in JupyterLab.
Directions:
- Ensure that the above specifications are met.
- To participate in a task, download the psychophysical script and the corresponding tone generating function. Make sure that iso226.m and the corresponding data folder are also in the working folder. Run the psychophysical script, enter a subject name of your choice, and follow the task instructions until the task concludes.
- To analyze psychophysical data from the original experiments, download the data folder located inside each experiment folder, which contains the .mat files for all participants in that experiment. Make sure the analysis script is in the same folder as the data, then run the script.
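Following the steps above, a session might look like the following in MATLAB (a sketch using the ternary experiment as an example; the layout inside each archive, including the data folder name, is an assumption, so adjust paths as needed):

```matlab
% Sketch: running a task and its analysis, assuming the file names
% listed above have been unpacked into the current working folder.

% 1) Participate in a task (example: the ternary experiment).
%    The working folder must contain ternary.m, ternaryFunction.m,
%    iso226.m, and the experiment's data folder.
assert(isfile('iso226.m'), 'iso226.m must be in the working folder');
ternary                     % prompts for a subject name, then runs the task

% 2) Reproduce a figure from the original data (example: Figure 1c).
%    Run the analysis script from the folder holding the .mat files.
cd data                     % hypothetical name of the downloaded data folder
analyzeTernaryExperiment
```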
Code access:
The code used to produce stimuli, analyze data, and generate figures is available on the humanAuditoryCorrelations GitHub repository.
Human subjects data
All participant data were anonymized prior to sharing. Prior to the study, all participants provided informed consent for the sharing of fully anonymized data.
