Data from: Active head movements contribute to spatial updating across gaze shifts

Cite this dataset

Bayer, Manuel; Zimmermann, Eckart (2024). Data from: Active head movements contribute to spatial updating across gaze shifts [Dataset]. Dryad. https://doi.org/10.5061/dryad.d51c5b09x

Abstract

Keeping visual space constant across movements of the eye and head is a not yet fully understood feature of perception. To understand the mechanisms that update the internal coordinates of space, research has mostly focused on eye movements. However, in natural vision, head movements are an integral part of the gaze shifts that enlarge the field of view. Here, we directly compared spatial updating for eye and head movements. In a virtual reality environment, participants had to localize the position of a stimulus across the execution of a gaze shift. We found that performing head movements increased the accuracy of spatial localization. By manipulating the speed of the visual scene displacement that a head movement produced, we found that spatial updating takes into account the sensorimotor contingencies of vision. Traditional accounts of perception during gaze shifts assume that self-produced changes of vision are suppressed. In direct contrast to this theory, we find that self-produced changes in vision are analyzed by the sensorimotor system and used to monitor the displacement vector of the head. We conclude that head movements contribute to stabilizing visual space across gaze shifts and that the contingencies of head movements, rather than being cancelled, facilitate the updating.

README: Active head movements contribute to spatial updating across gaze shifts

https://doi.org/10.5061/dryad.d51c5b09x

The data are pre-processed and saved as pickle files (a Python-specific file format). Each file can be opened in Python with the pandas package and its read_pickle() function. The raw data are too large to upload.
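For instance, a single file can be loaded like this (a minimal sketch; the .pkl extension is an assumption and may differ in the actual upload, and the file name is taken from the naming-scheme example below):

```python
import pandas as pd

# Load one pre-processed data file (a pandas object stored as a pickle).
# The .pkl extension is assumed; the file name follows the scheme below.
df = pd.read_pickle("AD21_Eye_Movement_PermanentTargets_Gabor_19.6.2023_eye.pkl")
print(df.head())
```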

Each data-file name encodes the following information, separated by underscores:

- SubjectID (e.g. AD21)
- what moved in this condition (e.g. eyes only, or head and eyes)
- the restriction imposed on the gaze shift: "movement" = unrestricted; "no movement" = restricted (the eyes had to reach the gaze-shift target before any head movement)
- the visibility of the gaze-shift targets: "permanent targets" = start and target stimulus were visible throughout the trial; "temporary targets" = start and target stimulus were visible only when needed to perform the gaze shift
- the background during the gaze shift: "gabor" = whole-field grating; "no gabor" = grey background
- the content of the file (e.g. "eye" = eye-tracking data)

For example, "AD21_Eye_Movement_PermanentTargets_Gabor_19.6.2023_eye" contains the eye-tracking data of subject AD21 in the condition where only the eyes moved, the gaze shift was performed unrestrictedly, the start and target stimuli were visible for the whole trial, and a whole-field grating was present during the gaze shift.
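The field order above suggests a simple way to recover a file's condition from its name. The following sketch is illustrative only: it assumes every field occupies a single underscore-separated token, as in the example above; conditions whose labels contain spaces (e.g. "head and eyes", "no movement") may be tokenized differently in the actual files, and the dictionary keys are illustrative labels, not names from the dataset:

```python
def parse_filename(name: str) -> dict:
    """Split a data-file name into the fields described above.

    Assumes one underscore-separated token per field, as in the example
    'AD21_Eye_Movement_PermanentTargets_Gabor_19.6.2023_eye'.
    """
    subject, effector, restriction, targets, background, date, content = name.split("_")
    return {
        "subject": subject,          # e.g. 'AD21'
        "effector": effector,        # what moved, e.g. 'Eye'
        "restriction": restriction,  # e.g. 'Movement' = unrestricted gaze shift
        "targets": targets,          # e.g. 'PermanentTargets'
        "background": background,    # e.g. 'Gabor' = whole-field grating
        "date": date,                # recording date, e.g. '19.6.2023'
        "content": content,          # e.g. 'eye' = eye-tracking data
    }

print(parse_filename("AD21_Eye_Movement_PermanentTargets_Gabor_19.6.2023_eye"))
```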

Most data files contain a time column (in ns, relative to the start of the day on which the data were recorded) and columns for the different axes of the tracking data; for the eye-tracking data, for example, these are the y-axis (horizontal) and z-axis (vertical). The head-tracking files contain pitch, yaw, and roll. The timing files contain the onset and offset of the visual cues and stimuli shown during the individual trials; these can be used to cut the continuous tracking data into individual trials (a sketch follows below).
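As a sketch of that last step: the column names ("time", "onset", "offset") and the "_timing" file suffix used here are hypothetical placeholders, not confirmed names from the dataset; check the actual files before use. The sketch assumes the timing file holds one row per trial on the same ns time base as the tracking data:

```python
import pandas as pd

# Column and file names below are illustrative placeholders.
eye = pd.read_pickle("AD21_Eye_Movement_PermanentTargets_Gabor_19.6.2023_eye.pkl")
timing = pd.read_pickle("AD21_Eye_Movement_PermanentTargets_Gabor_19.6.2023_timing.pkl")

# Cut the continuous tracking data into one segment per trial, keeping
# samples whose timestamps fall between that trial's onset and offset.
trials = [
    eye[(eye["time"] >= row.onset) & (eye["time"] <= row.offset)]
    for row in timing.itertuples()
]
print(f"extracted {len(trials)} trials")
```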

Funding

European Research Council, Award: 757184–moreSense, European Union's Horizon 2020 research and innovation program