Dryad

Data from: Neural tracking of syllabic and phonemic time scale

Data files

Version: Aug 13, 2024 (1.69 GB)

Abstract

Dynamical theories of speech processing propose that the auditory cortex parses acoustic information in parallel at the syllabic and phonemic time scales. A paradigm was developed to independently manipulate both linguistic time scales, and intracranial recordings were acquired from eleven patients with epilepsy listening to French sentences. Our results indicate that (i) syllabic and phonemic time scales are both reflected in the acoustic spectral flux; (ii) during comprehension, the auditory cortex tracks the syllabic time scale in the theta range, while neural activity in the alpha-beta range phase locks to the phonemic time scale; (iii) these neural dynamics occur simultaneously and share a joint spatial location; (iv) the spectral flux embeds two time scales, in the theta and low-beta ranges, across 17 natural languages. These findings clarify how the human brain extracts acoustic information from the continuous speech signal at multiple time scales simultaneously, a prerequisite for subsequent linguistic processing.
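The abstract's central acoustic measure is the spectral flux, the frame-to-frame change in the magnitude spectrum of the speech signal. The dataset page does not specify the exact implementation used in the analyses, so the sketch below assumes a standard definition: a half-wave-rectified difference between consecutive short-time Fourier transform frames. The function name `spectral_flux` and the frame/hop parameters are illustrative choices, not taken from the authors' code.

```python
import numpy as np

def spectral_flux(signal, sr, frame_len=0.025, hop=0.010):
    """Frame-wise spectral flux of a 1-D audio signal.

    Computes the summed positive change in the magnitude spectrum
    between consecutive Hann-windowed STFT frames (a common
    definition; the dataset's own pipeline may differ).
    """
    n = int(frame_len * sr)   # samples per frame
    h = int(hop * sr)         # hop size in samples
    window = np.hanning(n)
    frames = np.array([signal[i:i + n] * window
                       for i in range(0, len(signal) - n, h)])
    mags = np.abs(np.fft.rfft(frames, axis=1))
    diff = np.diff(mags, axis=0)
    # Half-wave rectify: count only increases in spectral energy,
    # so onsets (e.g. syllable or phoneme boundaries) dominate.
    return np.maximum(diff, 0.0).sum(axis=1)
```

Peaks in the resulting flux curve mark acoustic onsets; analyzing the curve's modulation spectrum is one way the theta-range (syllabic) and low-beta-range (phonemic) time scales described above could be quantified.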