Optic flow and odometry data from intelrealsense camera

Cite this dataset

van Breugel, Floris (2021). Optic flow and odometry data from intelrealsense camera [Dataset]. Dryad.


Insects rely on the perception of image motion, or optic flow, to estimate their velocity relative to nearby objects. This information provides important sensory input for avoiding obstacles. However, certain behaviors, such as estimating the absolute distance to a landing target, accurately measuring absolute distance travelled, and estimating the ambient wind speed, require decoupling optic flow into its component parts: absolute ground velocity and distance to nearby objects. Behavioral experiments suggest that insects perform these calculations, but their mechanism for doing so remains unknown. Here we present a novel algorithm that combines the geometry of dynamic forward motion with known features of insect visual processing to provide a hypothesis for how insects might directly estimate absolute ground velocity from a combination of optic flow and acceleration information. Our robotics-inspired biology approach reveals three critical requirements. First, absolute ground velocity can only be directly estimated from optic flow during times of active acceleration and deceleration. Second, spatial pooling of optic flow across a receptive field helps to alleviate the effects of noise and/or low-resolution visual systems. Third, averaging velocity estimates from multiple receptive fields further helps to reject noise. Our algorithm provides a hypothesis for how insects might estimate absolute velocity from vision during active maneuvers, and also provides a theoretical framework for designing fast analog circuitry for efficient state estimation that can be applied to insect-sized robots.
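The three requirements above can be illustrated with a minimal numerical sketch. This is not the paper's exact algorithm; it assumes a simplified parallel-wall geometry in which each receptive field reports translational optic flow r = v/d for a camera moving at ground velocity v at a constant distance d from the surface. Under that assumption, dr/dt = a/d during a maneuver with acceleration a, so the absolute velocity follows directly as v = a·r/(dr/dt), which is only defined while the camera is actively accelerating. All parameter values below (noise level, number of receptive fields, trajectory) are arbitrary choices for illustration:

```python
import numpy as np

# Sketch of direct velocity estimation from optic flow + acceleration,
# assuming constant distance d to a surface so that r = v / d and
# dr/dt = a / d, giving v = a * r / (dr/dt). Only valid while accelerating.

rng = np.random.default_rng(0)

dt = 0.01                      # time step, s
t = np.arange(0.0, 2.0, dt)   # 2 s maneuver
a = 0.5                        # constant forward acceleration, m/s^2
v = 1.0 + a * t                # true ground velocity, m/s
d = 2.0                        # constant distance to surface, m

# Requirement 2: spatial pooling. Each of many noisy receptive fields
# reports optic flow r = v/d; averaging across fields suppresses noise.
n_fields = 100
noise = 0.01
r = v[None, :] / d + noise * rng.standard_normal((n_fields, len(t)))
r_pooled = r.mean(axis=0)

# Temporal derivative of the pooled flow signal.
r_dot = np.gradient(r_pooled, dt)

# Requirement 1: the direct estimate v = a * r / (dr/dt) exists only
# because the camera is accelerating (r_dot != 0 throughout).
v_hat = a * r_pooled / r_dot

# Requirement 3: averaging estimates (here, a robust median over the
# maneuver) further rejects noise; it should recover the mid-maneuver
# velocity, since v grows linearly in time.
v_mid = v[len(t) // 2]
v_est = np.median(v_hat)
```

With these settings the pooled, averaged estimate `v_est` lands close to the true mid-maneuver velocity `v_mid`, while a single noisy receptive field would give a far more erratic instantaneous estimate.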


Data was collected with an Intel RealSense camera and an overhead tracking camera. See the paper for details.

Usage notes

Data is intended to be used with the code given at


National Institute of General Medical Sciences, Award: P20GM103650

Alfred P. Sloan Foundation, Award: FG-2020-13422