A cognitive process occurring during sleep revealed by rapid eye movements

Cite this dataset

Senzai, Yuta; Scanziani, Massimo (2023). A cognitive process occurring during sleep revealed by rapid eye movements [Dataset]. Dryad. https://doi.org/10.7272/Q6P26WDC

Abstract

Since the discovery of REM sleep, the nature of the rapid eye movements that characterize this sleep phase has remained elusive. Do they reveal gaze shifts in the virtual environment of dreams or simply reflect random brainstem activity? We harnessed the head direction (HD) system of the mouse thalamus, a neuronal population whose activity reports, in awake mice, their actual HD as they explore their environment and, in sleeping mice, their virtual HD. We discovered that the direction and amplitude of rapid eye movements during REM sleep reveal the direction and amplitude of the ongoing changes in virtual HD. Thus, rapid eye movements disclose gaze shifts in the virtual world of REM sleep, thereby providing a window into the cognitive processes of the sleeping brain.

Methods

Extracellular electrophysiological recordings

We recorded from the anterodorsal nucleus of the thalamus (ADN) of mice, both in their home cage and while they explored an open field arena for food, in 60–120 min sessions. Recordings totaled 6 to 12 hours per mouse. Electrophysiological data were acquired with an Intan RHD2000 system (Intan Technologies LLC), band-pass filtered between 0.1 Hz and 7.5 kHz, and digitized at 20 kHz. Spike sorting was performed semi-automatically using Kilosort 2.0, followed by manual adjustment of the waveform clusters in the software Phy. For the local field potential (LFP), the wide-band signal was down-sampled to 1.25 kHz.
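
As a rough illustration of the LFP step, the sketch below down-samples a 20 kHz wide-band trace to 1.25 kHz (a factor of 16) in Python with SciPy. The anti-aliasing filter choices are assumptions, as the Methods do not specify them; the decimation is split into two stages of 4, as recommended for large factors.

import numpy as np
from scipy.signal import decimate

FS_WIDEBAND = 20_000  # acquisition rate (Hz)
FS_LFP = 1_250        # target LFP rate (Hz)

def wideband_to_lfp(wideband: np.ndarray) -> np.ndarray:
    """Down-sample a wide-band trace (20 kHz) to the 1.25 kHz LFP rate.

    The total factor of 16 is applied in two stages of 4, with decimate's
    built-in anti-aliasing FIR filter applied before each stage.
    """
    x = wideband.astype(np.float64)
    x = decimate(x, 4, ftype="fir", zero_phase=True)
    x = decimate(x, 4, ftype="fir", zero_phase=True)
    return x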

Eye tracking

We adopted the head-mounted eye tracking system described in Meyer et al. (2018). Lightweight camera modules (Arducam, with a 5 MP Omnivision OV5647 sensor), one per eye, were used for eye tracking. Each camera was inserted into a 3D-printed camera holder integrated into the head plate. Each eye was illuminated by an infrared LED (VSMB2943GX01, Vishay), also integrated into the head plate. Videos from the head-mounted cameras were acquired at a frame rate of 90 Hz using single-board computers (Raspberry Pi 3 Model B, Raspberry Pi Foundation), one per camera, controlled by the software described in Meyer et al. (2018) and available in their GitHub repository.

Open field arena

The rectangular open field arena (50 × 28 cm) was surrounded by 25 cm high walls displaying salient visual cues. The arena base and walls were made of white plastic. The arena was illuminated by visible room lights as well as by infrared LEDs for video acquisition. Video (50 Hz sampling rate) was captured by a CMOS camera (Basler acA1300-200um) placed above the arena, with an infrared pass filter (Hoya) in front of the lens. The heading of the animal was defined as the direction of the vector connecting the neck to the nose of the mouse. The coordinates of the nose and neck were extracted with DeepLabCut. The heading was up-sampled to 100 Hz by linear interpolation and smoothed with a running average (210 ms window). A minimal sketch of this heading computation follows below.
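
The sketch assumes DeepLabCut outputs per-frame (x, y) coordinates for the nose and neck along with frame timestamps; the function and variable names are hypothetical. Angles are unwrapped before interpolation so that 2π jumps are not averaged across.

import numpy as np

def compute_heading(nose_xy, neck_xy, t_video, fs_out=100, smooth_ms=210):
    """Heading from the neck-to-nose vector, up-sampled and smoothed.

    nose_xy, neck_xy : (n_frames, 2) arrays of DeepLabCut coordinates
    t_video          : (n_frames,) frame timestamps in seconds (50 Hz video)
    """
    # Heading angle of the neck-to-nose vector, in radians.
    v = nose_xy - neck_xy
    heading = np.unwrap(np.arctan2(v[:, 1], v[:, 0]))

    # Linear interpolation onto a uniform 100 Hz time base.
    t_out = np.arange(t_video[0], t_video[-1], 1.0 / fs_out)
    heading_100 = np.interp(t_out, t_video, heading)

    # Running average with a 210 ms window (21 samples at 100 Hz).
    win = int(round(smooth_ms / 1000 * fs_out))
    kernel = np.ones(win) / win
    heading_smooth = np.convolve(heading_100, kernel, mode="same")

    # Wrap back to (-pi, pi].
    return np.angle(np.exp(1j * heading_smooth))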

Usage notes

MATLAB or Python can be used to open the data files.

Neuroscope (http://neurosuite.sourceforge.net/) can also be used to open the .eeg files.
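
For reading the .eeg files in Python, the sketch below assumes they follow the standard Neurosuite flat-binary convention (interleaved int16 samples at the 1.25 kHz LFP rate); the channel count must be taken from the session metadata, and the file name shown is hypothetical.

import numpy as np

FS_LFP = 1_250  # LFP sampling rate (Hz), per the Methods

def load_eeg(path, n_channels):
    """Load a Neuroscope-style .eeg file as a (n_samples, n_channels) array.

    Assumes the standard Neurosuite flat-binary layout: interleaved int16
    samples. n_channels must come from the session metadata (e.g., the
    accompanying .xml file).
    """
    data = np.fromfile(path, dtype=np.int16)
    return data.reshape(-1, n_channels)

# Example (hypothetical file name and channel count):
# lfp = load_eeg("session1.eeg", n_channels=64)
# t = np.arange(lfp.shape[0]) / FS_LFP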

Funding

National Institute of Neurological Disorders and Stroke, Award: U19NS107613

National Eye Institute, Award: R01EY025668

Howard Hughes Medical Institute

Japan Society for the Promotion of Science