Data from: Sensation of electric fields in Drosophila melanogaster
Data files (Mar 28, 2025 version, 34.45 GB total):
- Behavior.zip (2.94 GB)
- Ca_imaging_data.zip (29.21 GB)
- IHC.zip (2.30 GB)
- README.md (21.05 KB)
- simulations.zip (63.59 KB)
Abstract
Electrosensation has emerged as a crucial sensory modality for social communication, foraging, and predation across the animal kingdom. However, its presence and functional role as well as the neural basis of electric field perception in Drosophila and other invertebrates remain unclear. In environments with controlled electric fields, we identified electrosensation as a new sense in the Drosophila melanogaster larva. We found that the Drosophila larva performs robust electrotaxis: when exposed to a uniform electric field, larvae migrate toward the cathode (negative potential) and quickly respond to changes in the orientation of the field to maintain cathodal movement. Through a behavioral screen, we identified a subset of sensory neurons located at the tip of the larval head that are necessary for electrotaxis. Calcium imaging revealed that a pair of Gr66a-positive sensory neurons (one on each side of the head) encodes the strength and orientation of the electric field. Our results indicate that electric fields elicit robust behavioral and neural responses in the Drosophila larva, providing new evidence for the significance of electrosensation in invertebrates.
This repository contains all data needed to run the code found at https://gitlab.com/davidtadres/electrotaxis_publication and to recreate all figures presented in the paper titled ‘Sensation of electric fields in Drosophila melanogaster’.
Description of the Data and file structure
The folder ‘Behavior’ contains all behavioral data, ‘Ca_imaging_data’ contains all data related to calcium imaging, ‘simulations’ contains all data related to simulations, and ‘IHC’ contains all immunohistochemistry images.
We recommend using this repository in the following order:
- Identify the figure in the manuscript you want to re-plot (e.g. Figure 1F).
- Download the repository containing all code from the GitLab link above.
- Follow the README of that repository to point the code to the data from Dryad.
- In the folder ‘electrotaxis_publication’, find the Jupyter notebook with the desired figure name (e.g. “Fig1_multi_animal_assay.ipynb”).
- Open the Jupyter notebook and run the script. This should produce the unedited plot that was used for that figure.
Below is detailed information about each file type found in this repository.
The folder ‘Behavior’ contains exclusively data collected with PiVR (https://doi.org/10.1371/journal.pbio.3000712, www.PiVR.org). Data in the folders ‘pulse_experiment’ and ‘switch’ are single-animal experiments. Navigation indices were calculated by first running the script ‘electrotaxis_publication/Behavior_analysis/single_animal_assay/Calculate_Navigation_Index.py’. Data in the folder ‘static’ are multi-animal experiments. After digitization of the animal positions, the scripts in ‘electrotaxis_publication/Behavior_analysis/multi_animal_assay’ were run to calculate the preference index. Please read the Methods in the manuscript ‘Sensation of electric fields in Drosophila melanogaster’ for more information.
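For intuition, below is a rough sketch of what a navigation index of this kind measures. The exact definition used in the paper is given in the Methods; the field axis (x) and the synthetic trajectory here are assumptions for illustration only.

```python
import numpy as np

# Hypothetical trajectory of one larva, in mm. The e-field axis is
# assumed to be x (an assumption; see the Methods for the definition
# actually used in the paper).
t = np.arange(0.0, 10.0, 0.5)          # time in seconds
x = np.linspace(0.0, 20.0, t.size)     # steady drift along +x
y = 0.5 * np.sin(t)                    # sideways wiggle

vx = np.gradient(x, t)                 # velocity along the field axis
vy = np.gradient(y, t)
speed = np.hypot(vx, vy)

# Navigation index: velocity along the field axis normalized by speed,
# averaged over the trajectory. +1 = moving straight along +x, 0 = no bias.
nav_index = float(np.mean(vx / speed))
print(nav_index)
```

For this almost-straight synthetic trajectory the index is close to +1; an unbiased random walk would average near 0.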
The structure of the ‘Behavior’ folder is as follows:
- Behavior
  - type of experiment (e.g. ‘pulse_experiment’ or ‘static’)
  - genotype (sometimes this is 2 folders deep, e.g. ‘Gr66a/MS193xMS581’, as a number of controls were run for each genotype)

We first focus on ‘pulse_experiment’ and ‘switch’, as those were single-animal experiments:
- some folders contain quality control and other files:
  - ‘CentroidSpeed.png’ is calculated by ‘Calculate_Navigation_Index.py’. It is similar to the tail speed shown in the figures: less precise (the tail indicates speed better) but more robust (the centroid position is generally more correct than the tail position before head/tail correction).
  - ‘median_nav_index.npy’ is calculated by ‘Calculate_Navigation_Index.py’. It is the raw median Navigation Index at each timepoint.
  - ‘Navigation Index.png’ is calculated by ‘Calculate_Navigation_Index.py’. It is a plot of ‘median_nav_index.npy’.
  - ‘mean_corrected_nav_index.npy’ is calculated by ‘Calculate_Navigation_Index.py’. It is the mean corrected Navigation Index per experiment (corrected meaning that the polarity of the e-field is taken into account).
  - ‘voltages.npy’ is calculated by ‘Calculate_Navigation_Index.py’. These are the voltages correctly aligned between experiments.
- individual experiments (e.g. ‘2023.06.14_16-40-17_pulse exp’). Each experimental folder contains a set of files which are the standard output of PiVR, as explained here: https://pivr.readthedocs.io/en/latest/output.html#tracking
  - “DATE_TIME_data.csv” contains the following columns:
- The frame (=image) number into the experiment
- The time in seconds since the experiment started
- The X (column) coordinate of the Centroid
- The Y (row) coordinate of the Centroid
- The X (column) coordinate of the head
- The Y (row) coordinate of the head
- The X (column) coordinate of the tail
- The Y (row) coordinate of the tail
- The X (column) coordinate of the midpoint
- The Y (row) coordinate of the midpoint
    - The Y-min (row) coordinate of the bounding box (if missing, see ‘bounding_boxes.npy’ below)
    - The Y-max (row) coordinate of the bounding box (if missing, see ‘bounding_boxes.npy’ below)
    - The X-min (column) coordinate of the bounding box (if missing, see ‘bounding_boxes.npy’ below)
    - The X-max (column) coordinate of the bounding box (if missing, see ‘bounding_boxes.npy’ below)
    - The local threshold used to extract the binary image during tracking (missing in older experiments)
    - The presented stimulus

    In addition, there are two more columns which are specific to the ‘PiVR electrotaxis’ version:
    - The ‘measured_voltage’ during the relevant frame. This is usually empty, as the voltmeter was left unplugged during experiments because it introduced another electrical connection between the anode and the cathode, potentially lowering the electric field.
    - The ‘measured_generic_value’, the readings of a Yocto-milliVolt-Rx in parallel to the electric circuit with a shunt resistor of 5.1 Ohm.
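As a sketch of how a “DATE_TIME_data.csv” with this layout could be parsed by column position (header names are not listed above, so the column indices below are assumptions, and a tiny synthetic table stands in for a real recording):

```python
import csv, io

# Two synthetic rows following the column order described above:
# frame, time, centroid X/Y, head X/Y, tail X/Y, midpoint X/Y,
# bounding box Ymin/Ymax/Xmin/Xmax, threshold, stimulus.
synthetic = io.StringIO(
    "0,0.00,120,80,125,78,115,82,120,80,70,90,110,130,17,0.0\n"
    "1,0.05,121,80,126,78,116,82,121,80,70,90,111,131,17,0.0\n"
)
rows = list(csv.reader(synthetic))

frame  = [int(r[0]) for r in rows]     # frame number into the experiment
time_s = [float(r[1]) for r in rows]   # seconds since the experiment started
head_x = [float(r[4]) for r in rows]   # head X (column) coordinate
head_y = [float(r[5]) for r in rows]   # head Y (row) coordinate
print(frame, time_s, head_x)
```

On real data, replace the `io.StringIO` with `open(...)` on the experiment's csv file.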
- “Background.tiff” contains the reconstructed background image.
- “*.csv”, for example ‘10Hz_90Vswitch.csv’ or ‘10Hz_pulse_protocol.csv’, is the stimulus file PiVR used to present the electric field stimuli during that particular experiment.
- “experiment_settings.json” is a json file that contains a lot of useful experimental information:
  - Camera Shutter Speed [us]: shutter speed in microseconds
  - Exp. Group: the string that was entered by the user during the experiment
  - Experiment Date and Time: exactly that
  - Framerate: the frequency at which PiVR tracked the animal
  - Model Organism: while tracking, PiVR used the parameters of this animal to optimize tracking. See the PiVR documentation for how to modify this parameter.
  - PiVR info (recording): version number, git branch and git hash of the PiVR software that was used to record the experiment.
  - PiVR info (tracking): version number, git branch and git hash of the PiVR software that was used to track the experiment. If tracking was done online, this is identical to the info above.
  - Pixel per mm: for PiVR to be able to track the animal, it needs to know how many pixels correspond to one mm. This was set by the user as described in the PiVR documentation.
  - Recording time: the time in seconds that PiVR was tracking the animal
  - Resolution: the camera resolution in pixels that PiVR used while tracking
  - Time delay due to Animal Detection[s]: for autodetection, the animal must move. The time between pressing start and successful animal detection is saved here.
  - Virtual Reality arena name: if no virtual arena was presented, this will say None
  - backlight 2 channel: if Backlight 2 has been defined (as described in the PiVR documentation), the chosen GPIO (e.g. 18) and the maximal PWM frequency (e.g. 40000) are saved as a [list].
  - backlight channel: if Backlight 1 has been defined, the chosen GPIO (e.g. 18) and the maximal PWM frequency (e.g. 40000) are saved as a [list]. This would normally be defined as [18, 40000].
  - output channel 1: if Channel 1 has been defined, the chosen GPIO (e.g. 17) and the maximal PWM frequency (e.g. 40000) are saved as a [list].
  - output channel 2: if Channel 2 has been defined, the chosen GPIO (e.g. 27) and the maximal PWM frequency (e.g. 40000) are saved as a [list].
  - output channel 3: if Channel 3 has been defined, the chosen GPIO (e.g. 13) and the maximal PWM frequency (e.g. 40000) are saved as a [list].
  - output channel 4: if Channel 4 has been defined, the chosen GPIO (e.g. 13) and the maximal PWM frequency (e.g. 40000) are saved as a [list].
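A minimal sketch of reading a few of the keys listed above from an “experiment_settings.json” (the values below are invented for illustration; on real data, open the json file from the experiment folder):

```python
import json, io

# Synthetic stand-in for an experiment_settings.json, using key names
# documented above; all values are made up.
example = io.StringIO(json.dumps({
    "Framerate": 30,
    "Pixel per mm": 8.5,
    "Recording time": 300,
    "Model Organism": "Drosophila larva",
    "backlight channel": [18, 40000],
}))
settings = json.load(example)

fps = settings["Framerate"]                  # frames per second during tracking
px_per_mm = settings["Pixel per mm"]         # spatial calibration
n_frames = fps * settings["Recording time"]  # expected number of tracked frames
print(fps, px_per_mm, n_frames)
```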
- “first_frame_data.json” is a json file and contains information collected during animal detection:
  - bounding box col max: the X_max value of the bounding box of the animal detected in the first frame during animal detection
  - bounding box col min: the X_min value of the bounding box of the animal detected in the first frame during animal detection
  - bounding box row max: the Y_max value of the bounding box of the animal detected in the first frame during animal detection
  - bounding box row min: the Y_min value of the bounding box of the animal detected in the first frame during animal detection
  - centroid col: the X value of the centroid of the animal detected in the first frame during animal detection
  - centroid row: the Y value of the centroid of the animal detected in the first frame during animal detection
  - filled area: the filled area in pixels of the blob defined as the animal in the first frame during animal detection
- “sm_raw.npy” is a numpy file containing the small image of the tracked animal. This file comes in shape [“y size”, “x size”, # of frames] with “y size” == “x size”.
- “sm_thresh.npy” is a numpy file containing the binarized small image of the tracked animal. This file comes in shape [“y size”, “x size”, # of frames] with “y size” == “x size”.
- “sm_skeletons.npy” is a numpy file containing the skeleton of the tracked animal. This file comes in shape [“y size”, “x size”, # of frames] with “y size” == “x size”.
- “undistort_matrices.npz” contains the undistort files used to correct for lens distortion of the image via the opencv function cv2.undistortPoints. See https://pivr.readthedocs.io/en/latest/manual_software.html#undistort-options for more information.
- “Overview of tracking.png” shows the trajectory of the animal using the coordinates saved in “DATE_TIME_data.csv”.
- Some folders contain additional files. This is because, instead of real-time tracking, a video was recorded and tracking was done post hoc:
  - “IDENTIFIER_Video.h264” is the video that was recorded during the experiment.
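A minimal sketch of working with the [“y size”, “x size”, # of frames]-shaped stacks described above (a synthetic array stands in for a real file; on real data use `np.load("sm_raw.npy")`):

```python
import numpy as np

# Synthetic stand-in for sm_raw.npy: square 40x40 frames, 100 frames,
# stacked along the last axis as documented above.
sm_raw = np.zeros((40, 40, 100), dtype=np.uint8)

y_size, x_size, n_frames = sm_raw.shape
frame_10 = sm_raw[:, :, 10]   # extract a single small frame by index
print(y_size == x_size, n_frames, frame_10.shape)
```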
Next, we look at the ‘static’ folder:
- genotype (sometimes this is 2 folders deep, e.g. ‘Gr66a/MS193xMS581’, as a number of controls were run with each genotype)
- individual experiments (e.g. ‘2023.02.06_14-59-54_MS478xMS478_pos90V’). Each experimental folder contains a set of files which are the standard output of the PiVR multi-animal tracker, as explained here: https://pivr.readthedocs.io/en/latest/tools.html#multi-animal-tracking, in addition to the analysis files described below.
  - “DATE_TIME_stimulation.csv” was written by PiVR during the video recording and contains the following columns:
    - The frame (=image) number into the experiment
    - The time since the experiment started. Note that due to a (now fixed) bug in the PiVR software, this value must be divided by 1,000,000 to obtain the time in seconds.
    - The stimulus provided by Channel 1 (the only one relevant for the data here)
    - The stimulus provided by Channel 2 (not used in this study)
    - The stimulus provided by Channel 3 (not used in this study)
    - The stimulus provided by Channel 4 (not used in this study)
    - The ‘measured_voltage’ during the relevant frame. This is usually empty, as the voltmeter was left unplugged during experiments because it introduced another electrical connection between the anode and the cathode, potentially lowering the electric field.
    - The ‘measured_generic_value’, the readings of a Yocto-milliVolt-Rx in parallel to the electric circuit with a shunt resistor of 5.1 Ohm.
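The time-unit bug described above can be corrected as follows; a minimal sketch with two synthetic rows standing in for a real “DATE_TIME_stimulation.csv” (the column order follows the description above):

```python
import csv, io

# Synthetic rows: frame, time (stored in microseconds due to the bug),
# Channel 1..4 stimuli, measured_voltage (empty), measured_generic_value.
synthetic = io.StringIO(
    "0,0,5.0,0,0,0,,0.012\n"
    "1,50000,5.0,0,0,0,,0.013\n"
)
rows = list(csv.reader(synthetic))

time_s = [float(r[1]) / 1_000_000 for r in rows]  # microseconds -> seconds
channel_1 = [float(r[2]) for r in rows]           # the only channel used here
print(time_s, channel_1)
```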
- “Background.npy” is the mean image created during tracking.
- ‘defined_mideline.json’ is created by the script Behavior_analysis/multi_animal_assay/3_cathode_attraction_folders.py with user input.
- “experiment_settings.json” is a json file that contains a lot of useful experimental information:
  - Camera Shutter Speed [us]: shutter speed in microseconds
  - Exp. Group: the string that was entered by the user during the experiment
  - Experiment Date and Time: exactly that
  - Framerate: the frequency at which PiVR tracked the animal
  - Model Organism: while tracking, PiVR used the parameters of this animal to optimize tracking. See the PiVR documentation for how to modify this parameter.
  - PiVR info (recording): version number, git branch and git hash of the PiVR software that was used to record the experiment.
  - PiVR info (tracking): version number, git branch and git hash of the PiVR software that was used to track the experiment. If tracking was done online, this is identical to the info above.
  - Pixel per mm: for PiVR to be able to track the animal, it needs to know how many pixels correspond to one mm. This was set by the user as described in the PiVR documentation.
  - Recording time: the time in seconds that PiVR was tracking the animal
  - Resolution: the camera resolution in pixels that PiVR used while tracking
  - backlight 2 channel: if Backlight 2 has been defined (as described in the PiVR documentation), the chosen GPIO (e.g. 18) and the maximal PWM frequency (e.g. 40000) are saved as a [list].
  - backlight channel: if Backlight 1 has been defined, the chosen GPIO (e.g. 18) and the maximal PWM frequency (e.g. 40000) are saved as a [list]. This would normally be defined as [18, 40000].
  - output channel 1: if Channel 1 has been defined, the chosen GPIO (e.g. 17) and the maximal PWM frequency (e.g. 40000) are saved as a [list].
  - output channel 2: if Channel 2 has been defined, the chosen GPIO (e.g. 27) and the maximal PWM frequency (e.g. 40000) are saved as a [list].
  - output channel 3: if Channel 3 has been defined, the chosen GPIO (e.g. 13) and the maximal PWM frequency (e.g. 40000) are saved as a [list].
  - output channel 4: if Channel 4 has been defined, the chosen GPIO (e.g. 13) and the maximal PWM frequency (e.g. 40000) are saved as a [list].
- “stimulation_used.csv” is the stimulus file used to present the e-field stimulus during the video recording.
- “Video.h264” is created during the video recording.
- “Video_undistorted.avi” is the undistorted version of “Video.h264”, created by PiVR (https://pivr.readthedocs.io/en/latest/tools.html#undistort-video), which was used for further analysis.
- “XY_postion.csv” contains the x/y positions of larvae as tracked by the PiVR Multi Animal Tracker.
- “interpolated_XY_position.csv” is calculated by the script Behavior_analysis/multi_animal_assay/2.5_handling_missing_animals.py, which takes care of “lost animals”. These positions are used to calculate the FractionOnCathodeSide.npy below.
- “FractionOnCathodeSide.npy” is calculated by the script Behavior_analysis/multi_animal_assay/3_cathode_attraction_folders.py and contains the fraction of animals on the cathode side over time.
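For intuition, a hedged sketch of how a fraction-on-cathode-side trace can be derived from larval x positions and a midline; the authoritative computation lives in 3_cathode_attraction_folders.py, and the positions, midline value and cathode orientation below are all made up for illustration:

```python
import numpy as np

midline_x = 100.0   # stand-in for the user-defined midline ('defined_mideline.json')

# Synthetic x positions of 6 larvae at 3 timepoints (rows = timepoints),
# as one might extract from interpolated_XY_position.csv.
x_pos = np.array([
    [90.0, 110.0, 95.0, 120.0, 130.0, 85.0],
    [80.0, 100.0, 90.0, 110.0,  95.0, 85.0],
    [70.0,  90.0, 85.0,  95.0,  90.0, 80.0],
])

# Assuming the cathode lies on the low-x side of the midline (an assumption
# for this sketch): fraction of animals past the midline at each timepoint.
fraction_on_cathode_side = (x_pos < midline_x).mean(axis=1)
print(fraction_on_cathode_side)
```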
The structure of the ‘Ca_imaging_data’ folder is as follows:
- genotype + stimulus (e.g. Gr66a_5V_step or NP2728_15s_5V)
- individual experiments (e.g. ‘20221220_17-00-31_1V_2s_step’), which contain the imaging data, the stimulus information and the analysis:
  - The “*.tiff” file is the imaging file (x, y, t) containing the raw imaging data collected at the microscope.
  - The “*_AI_measurements.npy” files were created during recording by the “calcium_imaging/hyperscope_GUI.py” script. They contain the measured voltage provided during the experiment.
  - “signal_ROI.npy” contains the coordinates of the user-defined ROI, created with the “calcium_imaging/calc_dF_F.py” script.
  - “DATE_TIME_delta_F_over_F_ROIX.csv” is the output of the “calcium_imaging/calc_dF_F.py” script. It contains the following columns:
    - “Time [s]”: time in the experiment, in seconds
    - “Stimulation AO1 [V]”: the measured voltage on AI1 (anterior <-> posterior e-field)
    - “Stimulation AO3 [V]”: the measured voltage on AI3 (lateral e-field)
    - “dF/F no 1”: the dF/F of the ROI indicated in the filename
  - “DATE_TIME__stepdF_over_F.png” is a visualization of the data in “DATE_TIME_delta_F_over_F_ROIX.csv”.
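For orientation, a sketch of a standard dF/F computation of the kind stored in these files; calc_dF_F.py may differ in details such as the baseline window, and all numbers below are illustrative:

```python
import numpy as np

# Synthetic mean ROI fluorescence per frame (made-up values).
raw_f = np.array([100.0, 101.0, 99.0, 150.0, 160.0, 120.0])

f0 = raw_f[:3].mean()           # baseline F0 from pre-stimulus frames (assumed window)
df_over_f = (raw_f - f0) / f0   # dF/F per frame
print(df_over_f.round(2))
```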
The structure of the “simulations” folder is as follows:
- e_fields_PB
  - “*V_axial_E_field.txt” files contain COMSOL simulations. The first row is the position in meters, the second the electric field strength in V/m. See the filename for the simulation parameters (e.g. 8 mm agarose height, 90 V applied voltage difference).
- PB_measurements_and_simulations
  - “Electrotaxis_agarose_gel_and_PB_conductivities_hot_cool_100uM_to_40mM_data.txt” contains conductivity measurements for hot and cool PB, agarose made with PB, and theoretical conductivity values.
  - “Gel_electrophoresis_estimated_Temp_rise_vs_PB_concentration_mM.txt”: the first column contains the PB concentration (in mM) and the second column the estimated temperature rise (see Methods).
  - “Gel_electrophoresis_Normalized_E_field(V per m)_vs_PB_concentration_mM.txt”: the first column contains the PB concentration (in mM), the second the normalized e-field above the agarose (see Methods).
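The two-row “*V_axial_E_field.txt” exports described above can be read, for example, like this (a tiny synthetic file stands in for a real export; on real data pass the file path to `np.loadtxt`):

```python
import io
import numpy as np

# Synthetic stand-in for a *V_axial_E_field.txt file:
# first row = position in meters, second row = E-field in V/m.
synthetic = io.StringIO(
    "0.00 0.01 0.02 0.03\n"
    "900.0 905.0 910.0 915.0\n"
)
data = np.loadtxt(synthetic)

position_m = data[0]        # position along the axis, in meters
e_field_v_per_m = data[1]   # electric field strength, in V/m
print(position_m[-1], e_field_v_per_m.mean())
```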
The structure of the “IHC” folder is as follows:
- Figure 4A
- Raw image files “.lif” and adjusted image shown in figure as “.jpg”. These images show NP2729-Gal4, tsh-Gal80 expression pattern in the anterior part of the animal.
- Figure 5A
- Raw image files “.tif” and adjusted image shown in figure as “.jpg”. These images show Gr66a-Gal4 and Gr33a-Gal4 expression pattern in the anterior part of the animal.
- Figure S6A
- Raw image files “.tif” and adjusted image shown in figure as “.jpg”. These images show NP2729-Gal4 expression pattern in the brain.
- Figure S6B
- Raw image files “.tif” and adjusted image shown in figure as “.jpg”. These images show NP2729-Gal4, tsh-Gal80 expression pattern in the brain.
- Figure S7D
- Raw image files “.tif” and adjusted image shown in figure as “.jpg”. These images show Gr66a-Gal4 expression pattern in the anterior part of the animal.
- Figure S7E
- Raw image files “.tif” and adjusted image shown in figure as “.jpg”. These images show Gr66a-lexA expression pattern in the anterior part of the animal.
- Figure S7F
- Raw image files “.tif” and adjusted image shown in figure as “.jpg”. These images show Gr66a-lexA and Gr66a-Gal4 expression pattern in the anterior part of the animal.
- Figure S7G
- Raw image files “.tif” and adjusted image shown in figure as “.jpg”. These images show Gr66a-lexA and Gr33a-Gal4 expression pattern in the anterior part of the animal.
- Figure S7H
- Raw image files “.tif” and adjusted image shown in figure as “.jpg”. These images show Gr66a-lexA and NP2729-Gal4, tsh-Gal80 expression pattern in the anterior part of the animal.