Zebrafish larvae exploration and aversive chemotaxis dataset
Reddy, Gautam et al. (2021), Zebrafish larvae exploration and aversive chemotaxis dataset, Dryad, Dataset, https://doi.org/10.5061/dryad.6t1g1jwwz
This dataset contains recordings of larval zebrafish behavior. The full details are described in the paper "A lexical approach for identifying behavioral action sequences".
The experiment investigates zebrafish larvae behavior under free-swimming and aversive chemotaxis conditions. In each experiment, 12 larvae (7 days post-fertilization) were placed in 12 rectangular wells. Ten-minute-long videos were recorded at 160 Hz with an exposure time of 1 ms and a pixel size of 70 µm, using a ViewWorks camera (Basler acA2040-180km) controlled by the Hiris software (R&D Vision, Nogent-sur-Marne, http://www.rd-vision.com/r-d-vision-eng).
The fish were tracked using custom software, ZebraZoom (https://zebrazoom.org/). The algorithm begins by locating all the wells and extracting the background of the video. ZebraZoom then applies a series of steps to detect the animal in each well: i) the contours of the head and the entire body are detected using active contours; ii) the center of the head is identified as the center of mass of the head contour, and the tip of the tail is detected using both the curvature along the body contour and the distance to the center of the head. The midline is then identified between the left and right borders of the body contour. For each animal, the difference in pixel intensity between subsequent frames enables the automated detection of bout starts and ends. For each bout, the algorithm then calculates the head position, head direction, and tail angle, from which kinematic parameters are estimated: number of oscillations, instantaneous tail beat frequency, maximum amplitude of each tail bend, bout speed, bout duration, and bout distance. Tunable parameters in the tracking algorithm were optimized to detect the small-amplitude forward bouts that occur frequently during exploration. To validate the algorithm, we manually inspected validation videos in which the head direction and tail position were superimposed on the raw image whenever a bout was detected, allowing us to check both tracking and bout-detection quality.
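The bout segmentation described above, which thresholds the inter-frame pixel-intensity difference, can be sketched as follows. This is a minimal illustration, not the actual ZebraZoom implementation; the function name, threshold, and gap-merging heuristic are assumptions.

```python
import numpy as np

def detect_bouts(frame_diff, threshold, min_gap=5):
    """Detect (start, end) bout frames from per-frame intensity differences.

    frame_diff : 1-D array of summed absolute pixel-intensity differences
                 between subsequent frames within one well.
    threshold  : activity level above which the fish is considered moving
                 (a tunable parameter, as in the tracking algorithm).
    min_gap    : merge bouts separated by fewer than this many quiet frames.
    """
    active = frame_diff > threshold
    # Rising and falling edges of the binary activity signal.
    edges = np.diff(active.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    # Handle bouts that touch the start or end of the recording.
    if active[0]:
        starts = np.insert(starts, 0, 0)
    if active[-1]:
        ends = np.append(ends, len(active))
    # Merge bouts separated by short quiet gaps into a single bout.
    merged = []
    for s, e in zip(starts, ends):
        if merged and s - merged[-1][1] < min_gap:
            merged[-1] = (merged[-1][0], e)
        else:
            merged.append((s, e))
    return merged
```

With a suitable threshold, each returned (start, end) pair delimits one bout, from which the per-bout kinematic parameters listed above would then be computed.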
The dataset contains MATLAB files, which can be read using the Python code uploaded along with the dataset. The codebase also includes a Cython implementation of the BASS algorithm. A ReadMe explaining how to use the Python code to analyze the larval zebrafish dataset and how to use BASS is included with the code.
The dataset was collected using a zebrafish behavior assay at the Institut du Cerveau (ICM) and has been processed using ZebraZoom to produce a manuscript accepted for publication in PLoS Computational Biology.
The processed larval zebrafish data (in MATLAB file format) are compressed in the "Data_all" archive, and the code is archived in "BASS-master.zip". The Python notebook "Zebrafish_larvae_analysis_acid_data_final.ipynb" contains the functions to read and analyze the larval zebrafish data. The ReadMe file explains how to use BASS. Information on how the data was collected can be found in the associated manuscript referenced above.
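The MATLAB files can be inspected directly with SciPy before turning to the notebook. A minimal sketch; the path shown is a placeholder, and the actual variable names should be taken from the notebook and ReadMe:

```python
from scipy.io import loadmat

def list_variables(mat_path):
    """Return the MATLAB variable names stored in a .mat file,
    skipping the '__header__'-style metadata entries loadmat adds."""
    mat = loadmat(mat_path, squeeze_me=True)
    return [k for k in mat if not k.startswith("__")]

# Usage (placeholder path, not an actual file in the archive):
# print(list_variables("Data_all/example_recording.mat"))
```

`squeeze_me=True` drops the singleton dimensions MATLAB adds around scalars and vectors, which makes the arrays easier to handle in NumPy.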
National Science Foundation, Award: PHY-1748958
National Institutes of Health, Award: R25GM067110
Gordon and Betty Moore Foundation, Award: 2919.01
New York Stem Cell Foundation, Award: NYSCF-R-NI39
Human Frontier Science Program, Award: RGP0063/2018
Fondation Schlumberger pour l’Education et la Recherche, Award: FSER/2017
Investissements d’avenir, Award: ANR-10-IAIHU-06
NeurATRIS: Translational Research Infrastructure for Biotherapies in Neurosciences, Award: ANR-11-INBS-0011
European Research Council, Award: ERC-POC-2018#825273