Data from: Fine-grained neural coding of bodies and body parts in human visual cortex
Data files (Nov 21, 2024 version, 6.39 GB)
- DataPNAS.zip (6.39 GB)
- README.md (4.09 KB)
Abstract
Body perception plays a fundamental role in social cognition. Yet, the neural mechanisms underlying this process in humans remain elusive given the spatiotemporal constraints of functional imaging. Here we present for the first time intracortical recordings of single- and multi-unit spiking activity in two epilepsy surgery patients in or near the extrastriate body area (EBA), a critical region for body perception. Our recordings revealed a strong preference for human bodies over a large range of control stimuli. Notably, body selectivity was driven by a distinct selectivity for body parts. The observed body selectivity generalized to non-photographic depictions of bodies including silhouettes and stick figures. Overall, our study provides unique neural data that bridge the gap between human neuroimaging and macaque electrophysiology studies, laying a solid foundation for computational models of human body processing.
This repository contains the supplementary data and source code used in our study. The materials are organized into two main sections: Data and Source Code. Below, you will find detailed information on how the data is structured and how to utilize the provided source code for analysis.
Data
Directory Structure
The data is organized into folders corresponding to different experiments. Each experiment folder contains subfolders for each patient (P1 and P2). Below is the hierarchical structure:
```
├── AbstractBodies
│   ├── PatientP1
│   │   └── Session1_MUA.pkl
│   └── PatientP2
│       └── Session1_MUA.pkl
├── BodyParts
│   ├── PatientP1
│   │   ├── Session1_MUA.pkl
│   │   └── Session1_SUA.pkl
│   └── PatientP2
│       ├── Session1_P1_MUA.pkl
│       ├── Session1_P2_MUA.pkl
│       └── Session1_SUA.pkl
├── Categorbodies
│   ├── PatientP1
│   │   ├── Session1_MUA.pkl
│   │   ├── Session_Silhouettes_P1_MUA.pkl
│   │   └── Session_Silhouettes_P2_MUA.pkl
│   └── PatientP2
│       ├── Session1_P1_MUA.pkl
│       └── Session1_P2_MUA.pkl
├── EBALocalizer
│   └── PatientP2
│       └── Session1_P2_MUA.pkl
└── RotationTolerance
    ├── PatientP1
    │   └── Session1_MUA.pkl
    └── PatientP2
        └── Session1_MUA.pkl
```
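Once the archive is extracted, the experiment/patient/file hierarchy can be enumerated programmatically. The sketch below is not part of the released code; it only assumes the `Experiment/PatientPX/*.pkl` layout shown above:

```python
from pathlib import Path

def list_session_files(root):
    """Map (experiment, patient) -> sorted .pkl file names under `root`.

    `root` is assumed to be the folder obtained by extracting
    DataPNAS.zip, laid out as Experiment/PatientPX/Session*.pkl.
    """
    sessions = {}
    for pkl in Path(root).rglob("*.pkl"):
        # The two directories above each file are the experiment and
        # patient folders, respectively.
        experiment, patient = pkl.parts[-3], pkl.parts[-2]
        sessions.setdefault((experiment, patient), []).append(pkl.name)
    return {key: sorted(names) for key, names in sessions.items()}
```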
File Naming Convention
Each data file follows the naming pattern `SessionX_PY_Resolution.pkl`:
- SessionX: the recording session number.
- PY: the part of the session; used to split large files (omitted for single-part sessions).
- Resolution: the data resolution, either MUA (multi-unit activity) or SUA (single-unit activity).
Example: Session1_P1_MUA.pkl
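A hypothetical helper (not part of the released code) that parses this convention; note that a few files, such as Session_Silhouettes_P1_MUA.pkl, deviate from the pattern and are not handled here:

```python
import re

# Parses SessionX[_PY]_Resolution.pkl; the _PY token is optional, since
# single-part sessions omit it (e.g. Session1_MUA.pkl).
FILENAME_RE = re.compile(
    r"^Session(?P<session>\d+)(?:_P(?P<part>\d+))?_(?P<resolution>MUA|SUA)\.pkl$"
)

def parse_filename(name):
    """Return session number, part (or None), and resolution for a file name."""
    match = FILENAME_RE.match(name)
    if match is None:
        raise ValueError(f"unrecognized file name: {name}")
    return {
        "session": int(match.group("session")),
        "part": int(match.group("part")) if match.group("part") else None,
        "resolution": match.group("resolution"),
    }
```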
Data Format
All data files are in .pkl format and should be opened using Python. Each .pkl file contains a trials structure with entries corresponding to recorded trials.
Data Fields
Within each trial, the following keys are relevant for analysis:
- answer: 1 if the trial was correct, 0 otherwise.
- MUAA: a dictionary with keys formatted as elecX, where X is the recording electrode number from the Utah array (96 electrodes in total). Values are NumPy arrays containing the spike times for each electrode.
- current_stimul_name: the name of the stimulus presented during the trial.
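As a minimal loading sketch (assuming the trials structure is a list of dicts stored either directly in the pickle or under a "trials" key; adapt to the actual pickle layout):

```python
import pickle

import numpy as np

def load_trials(path):
    """Return the list of trial dicts stored in a session .pkl file."""
    with open(path, "rb") as f:
        data = pickle.load(f)
    # Assumption: the pickle is either the trials list itself or a dict
    # holding that list under a "trials" key.
    return data["trials"] if isinstance(data, dict) else data

def spike_counts_correct_trials(trials):
    """Total spike count per electrode, pooled over correct trials only."""
    counts = {}
    for trial in trials:
        if trial["answer"] != 1:  # keep correct trials only
            continue
        for elec, spike_times in trial["MUAA"].items():
            counts[elec] = counts.get(elec, 0) + int(np.asarray(spike_times).size)
    return counts
```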
Source Code
Overview
The source code provided is used to analyze the dataset. It is organized within the Code folder, which serves as the entry point for the various analysis scripts. The analysis is conducted at several levels, including:
- Single Electrodes
- Population Analysis
- Decoding
- V1 Modeling
Entry Point
The primary entry point for the source code is the Code folder, which contains scripts tailored to analyzing the data at each of these levels.
Data Preprocessing
To understand how the data are preprocessed, refer to the analysis_basic.py script of the neurovisual package. This script outlines the fundamental preprocessing steps applied to the raw data before analysis, as well as the single-recording-site analyses.
Additional Modules
Beyond the basic preprocessing, the source code includes extra modules that perform:
- Population analysis: Multi-electrode interactions and patterns
- Decoding analysis: Neural decoding and classification
- V1 modeling: Primary visual cortex response modeling
All data were collected using 96-channel Utah arrays implanted in human visual cortex and recorded with a Neuroport system (Blackrock Neurotech).