An fNIRS dataset for multimodal speech comprehension in normal hearing individuals and cochlear implant users
Data files
Jul 10, 2025 version files (724.79 MB total):
- ds_helpers.py (22.26 KB)
- ds_main.ipynb (4.02 MB)
- environment.yml (3.03 KB)
- README.md (3.95 KB)
- snirf_local-bids.zip (720.74 MB)
- supplementary_code.zip (1.31 KB)
Abstract
Understanding cortical processing in cochlear implant (CI) users is crucial for improving speech rehabilitation outcomes. Functional near-infrared spectroscopy (fNIRS) provides a non-invasive, implant-compatible method for assessing cortical activity during speech comprehension. This is a multimodal fNIRS dataset collected from 46 CI users and 26 normal hearing (NH) controls. Participants completed a clinically relevant speech comprehension task using the German Matrix Sentence Test (OLSA) under speech-in-quiet, speech-in-noise, audiovisual speech, and visual speech (i.e., lipreading) conditions. fNIRS recordings covered key cortical regions involved in speech processing, including the prefrontal, auditory, and visual cortices. In addition to fNIRS data, we provide detailed metadata, including patient history, hearing test results, behavioral measures, and spatially registered probe positions. The dataset represents a comprehensive fNIRS resource for investigating multimodal speech understanding in CI users.
Dataset DOI: 10.5061/dryad.crjdfn3g9
Description of the data and file structure
The dataset is organized according to the Brain Imaging Data Structure (BIDS) v1.10.0, with extensions for NIRS data [1,2].
The main data files are in the Shared Near Infrared Spectroscopy Format (SNIRF), stored in snirf_local-bids.zip. Each participant has an individual SNIRF file, accompanied by the text files required by the BIDS specification. These BIDS files replicate metadata from the SNIRF file, which allows relevant behavioral, event, or channel-related data to be parsed without an SNIRF reader.
Participant-level information is stored in the participants.tsv file in the root directory of the dataset, along with an accompanying sidecar (participants.json) that describes each column. These top-level files summarize relevant demographic and audiometric variables, enabling researchers to filter participant groups efficiently. To complement this, a more detailed Excel file (participants_info.xlsx) contains all metadata, with accompanying MATLAB code provided in the supplementary_code folder.
For Python, we provide a demonstration pipeline that includes loading, pre-processing, and visualization, implemented in a Jupyter notebook (ds_main.ipynb) and a supporting Python script (ds_helpers.py) containing the necessary helper functions. To run the demonstration, all dependencies need to be installed (listed in the environment.yml file).
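Because the BIDS-side files are plain tab-separated text, they can be inspected with the Python standard library alone, without an SNIRF reader. A minimal sketch, using a simulated excerpt of participants.tsv (the `group` column name and its CI/NH values are assumptions for illustration; the real column names are documented in participants.json):

```python
import csv
import io

# Simulated excerpt of participants.tsv. The "group" column and its
# CI/NH coding are assumptions; consult participants.json for the
# actual column definitions.
participants_tsv = io.StringIO(
    "participant_id\tgroup\n"
    "sub-S01\tCI\n"
    "sub-C01\tNH\n"
    "sub-S02\tCI\n"
)

# BIDS .tsv files are tab-delimited text, so csv.DictReader suffices.
rows = list(csv.DictReader(participants_tsv, delimiter="\t"))

# Filter to cochlear-implant users only.
ci_users = [r["participant_id"] for r in rows if r["group"] == "CI"]
print(ci_users)  # ['sub-S01', 'sub-S02']
```

To work with the real dataset, replace the `io.StringIO` object with `open("participants.tsv", newline="")` at the root of the extracted archive.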
Files
File: snirf_local-bids.zip
Description: Data files are organized according to the BIDS-NIRS format. Participants with cochlear implants have a label starting with "S", and normal hearing participants with "C".
dataset_description.json: Information on authorship, funding, and BIDS version
participants.tsv: Demographic and audiometric variables for quick filtering of the dataset
participants.json: Describes the columns of participants.tsv
participants_info.xlsx: All metadata summarized in Excel format
subfolders:
sub-<label>/
- nirs/
- sub-<label>_coordsystem.json: The coordinate system and units in which the positions of optodes and anatomical landmarks are expressed
- sub-<label>_optodes.tsv: Locations of the optodes and anatomical landmarks
- sub-<label>_task-full_channels.tsv: Detailed information on the pairing of source and detector optodes with a specific wavelength of light
- sub-<label>_task-full_events.tsv: Triggers during the experiment
- sub-<label>_task-full_nirs.json: Detailed information on the fNIRS instrument and task
- sub-<label>_task-full_nirs.snirf: Main data file
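Since group membership is encoded in the subject label ("S" for CI users, "C" for NH controls), the group and the expected BIDS-NIRS file paths can be derived directly from the label. A sketch of this layout (the label "S01" and the archive root name are illustrative, not guaranteed to match actual participant IDs):

```python
from pathlib import Path


def group_from_label(label: str) -> str:
    """Infer the participant group from the BIDS subject label:
    labels starting with 'S' denote cochlear-implant users,
    labels starting with 'C' denote normal-hearing controls."""
    if label.startswith("S"):
        return "CI"
    if label.startswith("C"):
        return "NH"
    raise ValueError(f"Unexpected subject label: {label}")


def nirs_files(root: Path, label: str) -> dict:
    """Build the expected BIDS-NIRS file paths for one participant."""
    nirs_dir = root / f"sub-{label}" / "nirs"
    stem = f"sub-{label}_task-full"
    return {
        "coordsystem": nirs_dir / f"sub-{label}_coordsystem.json",
        "optodes": nirs_dir / f"sub-{label}_optodes.tsv",
        "channels": nirs_dir / f"{stem}_channels.tsv",
        "events": nirs_dir / f"{stem}_events.tsv",
        "sidecar": nirs_dir / f"{stem}_nirs.json",
        "snirf": nirs_dir / f"{stem}_nirs.snirf",
    }


# "S01" is an illustrative label, not a confirmed participant ID.
files = nirs_files(Path("snirf_local-bids"), "S01")
print(group_from_label("S01"))        # CI
print(files["snirf"].as_posix())      # snirf_local-bids/sub-S01/nirs/sub-S01_task-full_nirs.snirf
```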
Files: ds_main.ipynb, ds_helpers.py, and environment.yml
Description: A runnable Jupyter notebook for demonstration, helper functions, and Python dependencies
File: supplementary_code.zip
Description: Supplementary code for MATLAB users to explore metadata
References
[1] Brain Imaging Data Structure: Specification for NIRS. https://bids-specification.readthedocs.io/en/stable/modality-specific-files/near-infrared-spectroscopy.html. Accessed 2025-01-31.
[2] Brain Imaging Data Structure: Specification for modality-agnostic files. https://bids-specification.readthedocs.io/en/stable/modality-agnostic-files.html. Accessed 2025-06-18.
Human subjects data
This study adhered to the Declaration of Helsinki and received approval from the local institutional review board (KEK-Bern, BASED-ID 2020-02978). All participants provided written informed consent before their participation. The data does not include information that can be used to identify an individual.
