Data from: Remote activation of place codes by gaze in a highly visual animal
Abstract
Vision enables many animals to perform spatial reasoning from remote locations. By viewing distant landmarks, animals recall spatial memories and plan future trajectories. Although these spatial functions depend on hippocampal place cells, the relationship between place cells and active visual behavior is unknown. Here, we studied a highly visual animal, the chickadee, in a behavior that required alternating between remote visual search and spatial navigation. We leveraged the head-directed nature of avian vision to track gaze in freely moving animals. We discovered a profound link between place coding and gaze. Place cells activated not only when the chickadee was in a specific location, but also when it simply gazed at that location from a distance. Gaze coding was precisely timed by fast, ballistic head movements called “head saccades”. On each saccadic cycle, the hippocampus switched between encoding a prediction of what the bird was about to see and a reaction to what it actually saw. The temporal structure of these responses was coordinated by subclasses of interneurons that fired at different phases of the saccade. We suggest that place and gaze coding are components of a unified process by which the hippocampus represents the location that is currently relevant to the animal. This process allows the hippocampus to implement both local and remote spatial functions.
This dataset includes spike-sorted data from all cells in the paper, as well as behavioral data from the corresponding sessions.
Dataset DOI: 10.5061/dryad.tqjq2bw9n
Description of the data and file structure
Parameters, behavioral data, and neural data are provided for each session in the paper. See Payne & Aronov (2025) for methodological details.
Files and variables
Data is included in the "data" folder. Files are stored in MATLAB (.mat) format.
SESSIONS.xlsx
Excel spreadsheet containing information for each experimental recording session.
SURGERY.xlsx
Excel spreadsheet containing information for each probe implant.
Session folders
Named in the format BIRD_YYMMDD, where BIRD is the bird identifier and YYMMDD is the session date.
params.mat
Parameters for this session.
P_expmt: parameters stored during the experiment. Key parameters useful for analysis are:
task: task type
track_type: type of gaze used to trigger sites
p_feeders: locations of feeders and other session parameters.
r_gaze: gaze radius to trigger sites
P_calib: positions and vectors of the eyes and beak relative to the head reference frame
v_eye_R_head: vector of the R eye relative to the head
v_eye_L_head: vector of the L eye relative to the head
p_eye_R_head: position of the R eye relative to the head
p_eye_L_head: position of the L eye relative to the head
p_beak_head: position of the beak tip relative to the head
Coordinates are given in the form [x,y,z], with the origin at the midpoint of the two eyes, the x axis pointing forward along the beak, the y axis pointing towards the left eye (but orthogonal to the x axis), and the z axis pointing up.
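As an example of using these conventions, an eye position from P_calib can be mapped into arena coordinates by combining it with the head pose stored in behavior.mat (described below). This is a minimal sketch; it assumes R_head(:,:,t) rotates head-frame vectors into the arena frame and that the stored [x,y,z] triplets can be reshaped into column vectors.

    % Sketch: right eye position and gaze direction in arena coordinates at frame t
    sess_dir = fullfile('data', 'BIRD_YYMMDD');              % edit: path to one session folder
    load(fullfile(sess_dir, 'params.mat'), 'P_calib');
    load(fullfile(sess_dir, 'behavior.mat'), 'p_head', 'R_head');
    t = 1000;                                                % example frame index
    p_eye_arena  = p_head(:,t) + R_head(:,:,t) * P_calib.p_eye_R_head(:);
    v_gaze_arena = R_head(:,:,t) * P_calib.v_eye_R_head(:);  % direction only; no translation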
P_hmm: parameters for hidden Markov model classification of behavioral states. Key parameters useful for analysis are:
state_text: description of each of the states in state_id
types: state types used for categorization. "node" = fixations, "saccade" = saccades, "feed" = feeding, "edge" = dashing.
fps: frames per second of the behavioral data.
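To select frames of a particular behavioral state, the per-frame state_id in behavior.mat (described below) can be looked up in these parameters. A minimal sketch, assuming types is a cell or string array indexed by state ID and that state_id may contain unclassified frames:

    % Sketch: find all fixation ("node") frames in a session
    sess_dir = fullfile('data', 'BIRD_YYMMDD');               % edit: path to one session folder
    load(fullfile(sess_dir, 'params.mat'), 'P_hmm');
    load(fullfile(sess_dir, 'behavior.mat'), 'state_id');
    valid = state_id >= 1 & state_id <= numel(P_hmm.types);   % guard unclassified frames (assumption)
    is_fix = false(size(state_id));
    is_fix(valid) = strcmp(P_hmm.types(state_id(valid)), 'node');
    fprintf('%.1f%% of frames are fixations\n', 100 * mean(is_fix));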
behavior.mat
Note: due to occasional gaps in tracking, the frame rate cannot be recovered from the mean difference between timestamps. For the frame rate, use the variable fps in params.mat, which was calculated after removing long gaps in tt.
tt: [nt×1] timestamp of each behavior frame, as recorded in the neural recording software. Use this to align behavior to spike data (see the sketch after this list).
p_head: [3×nt] head position relative to the arena.
R_head: [3×3×nt] head orientation relative to the arena. Head axes follow the coordinate conventions described under params.mat above.
lin_speed: [nt×1] linear head speed. Calculated by taking the frame-by-frame difference in 3D head position, then smoothed with a lowpass Butterworth filter with 25 Hz cutoff frequency.
ang_speed: [nt×1] angular head speed. Calculated by taking the frame-by-frame difference in 3D head angle, then smoothed with a lowpass Butterworth filter with 25 Hz cutoff frequency.
state_id: [nt×1] behavioral state of each frame, categorized by the hidden Markov model algorithm. Refer to the labels in params.mat.
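Because tt is recorded on the neural clock, spike times can be binned directly onto behavior frames. A minimal sketch, assuming the structure array in spikes.mat (described below) has the field layout listed there; frame edges are approximated from fps, so bins spanning tracking gaps should be treated with care:

    % Sketch: count spikes of one cell in each behavior frame
    sess_dir = fullfile('data', 'BIRD_YYMMDD');        % edit: path to one session folder
    B = load(fullfile(sess_dir, 'behavior.mat'));
    P = load(fullfile(sess_dir, 'params.mat'));
    S = load(fullfile(sess_dir, 'spikes.mat'));
    fn = fieldnames(S);
    cells = S.(fn{1});                                 % structure array of cells (variable name not assumed)
    st = cells(1).data;                                % spike times of the first cell, in seconds
    edges = [B.tt; B.tt(end) + 1/P.P_hmm.fps];         % approximate frame edges from timestamps
    n_spk = histcounts(st, edges)';                    % [nt×1] spike counts per frame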
behavior_hmm.mat
T_sac: table of detected gaze saccades
T_fix: table of detected gaze fixations
T_dash: table of detected dashes
T_feed: table of detected feeding periods
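The columns of each variable can be inspected directly without assuming their names. A minimal sketch, assuming these are stored as MATLAB table objects:

    % Sketch: inspect the saccade table and count detected events
    sess_dir = fullfile('data', 'BIRD_YYMMDD');   % edit: path to one session folder
    H = load(fullfile(sess_dir, 'behavior_hmm.mat'));
    disp(H.T_sac.Properties.VariableNames)        % list the columns of the saccade table
    fprintf('%d saccades, %d fixations\n', height(H.T_sac), height(H.T_fix));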
events.mat
M: table of event times (time), indices (ind, the closest index into tt in behavior.mat), and what happened (light = light state changed, motor = motor state changed). A value from 1 to 5 indicates which site was opened or turned on; 0 indicates that all sites were closed or turned off. Sites are ordered as in the variable p_feeders in params.mat.
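A minimal sketch of pulling site-opening events out of M, using the column names given above (rows without a motor change are assumed to be NaN in the motor column):

    % Sketch: times at which each site was opened or turned on
    sess_dir = fullfile('data', 'BIRD_YYMMDD');   % edit: path to one session folder
    E = load(fullfile(sess_dir, 'events.mat'));
    opened = E.M.motor >= 1;                      % NaN rows (assumption) compare false
    t_open = E.M.time(opened);                    % event times on the neural clock
    site   = E.M.motor(opened);                   % site number 1-5, ordered as in p_feeders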
spikes.mat
Structure array containing spike data for each hippocampal cell included in the paper (see the sketch after this list for example usage):
data: spike times
chanlabel: name of the cell, including session name and cell ID
chanval: channel on which the spike was recorded with max amplitude
samplerate: "event", indicating this is event rather than continuous data
tstart: time of first spike
tend: time of last spike
units: seconds
wavemark: unit ID given by Kilosort 2.0
waveform: waveform on max channel
info: additional information
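A minimal sketch of reading these fields to compute each cell's mean firing rate over its active span (the name of the structure array inside spikes.mat is not assumed):

    % Sketch: mean firing rate of every cell in a session
    sess_dir = fullfile('data', 'BIRD_YYMMDD');   % edit: path to one session folder
    S = load(fullfile(sess_dir, 'spikes.mat'));
    fn = fieldnames(S);
    cells = S.(fn{1});                            % structure array, one element per cell
    for k = 1:numel(cells)
        c = cells(k);
        rate = numel(c.data) / (c.tend - c.tstart);   % spikes/s; times are in seconds
        fprintf('%s: %.2f Hz\n', c.chanlabel, rate);
    end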
Code/software
Example MATLAB code is given in the "code" folder.
Edit RUNME.m so that the data_dir variable points to where the data are stored on your computer.
Change directories to the "code" folder and run RUNME.m.
Edit the script to plot results for different example cells.
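For example (only the data_dir variable is assumed from the instructions above; the rest of RUNME.m is not reproduced here):

    % In RUNME.m, set:
    data_dir = '/path/to/data';   % edit: location of the "data" folder on your computer
    % Then, at the MATLAB prompt:
    cd code
    RUNME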
