Data from: A convolutional neural network for detecting sea turtles in drone imagery
1. Marine megafauna are difficult to observe and count because many species travel widely and spend large amounts of time submerged. As such, management programs seeking to conserve these species are often hampered by limited information about population levels.
2. Unoccupied aircraft systems (UAS, aka drones) provide a potentially useful technique for assessing marine animal populations, but a central challenge lies in analyzing the vast amounts of image or video data acquired during each flight. Neural networks are emerging as a powerful tool for automating object detection across data domains and can be applied to UAS imagery to generate new population-level insights. To explore the utility of these emerging technologies in a challenging field setting, we used neural networks to enumerate olive ridley turtles (Lepidochelys olivacea) in drone images acquired during a mass-nesting event on the coast of Ostional, Costa Rica.
3. Results revealed substantial promise for this approach; specifically, our model detected 8% more turtles than manual counts while effectively reducing the manual validation burden from 2,971,554 to 44,822 image windows. Our detection pipeline was trained on a relatively small set of turtle examples (N=944), implying that this method can be easily bootstrapped for other applications, and is practical with real-world UAS datasets.
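The core labor saving described above is a screening step: a detector scores every image window and only the small fraction above a confidence threshold is passed on for manual validation. A minimal sketch of that idea follows; the window size, stride, threshold, and score function here are illustrative placeholders, not the authors' actual model or parameters.

```python
import numpy as np

def candidate_windows(image, window=32, stride=32, score_fn=None, threshold=0.5):
    """Slide a fixed-size window over an image and keep only windows whose
    detector score meets the threshold, reducing the manual review burden.
    Window size, stride, and threshold are hypothetical examples."""
    h, w = image.shape[:2]
    kept, total = [], 0
    for y in range(0, h - window + 1, stride):
        for x in range(0, w - window + 1, stride):
            total += 1
            score = score_fn(image[y:y + window, x:x + window])
            if score >= threshold:
                kept.append((y, x, score))
    return kept, total

# Stand-in "detector": mean window brightness as a proxy score.
rng = np.random.default_rng(0)
img = rng.random((128, 128))
kept, total = candidate_windows(img, score_fn=lambda p: p.mean(), threshold=0.55)
print(f"reviewing {len(kept)} of {total} windows")
```

In the study's pipeline the score function is a trained convolutional network rather than this brightness proxy, but the filtering logic is the same: only high-scoring windows reach a human validator.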
4. Our findings highlight the feasibility of combining UAS and neural networks to estimate population levels of diverse marine animals and suggest that the automation inherent in these techniques will soon permit monitoring over spatial and temporal scales that would previously have been impractical.
Funding: National Science Foundation