
dc.contributor.author Beijbom, Oscar
dc.contributor.author Treibitz, Tali
dc.contributor.author Kline, David I.
dc.contributor.author Eyal, Gal
dc.contributor.author Khen, Adi
dc.contributor.author Neal, Benjamin
dc.contributor.author Loya, Yossi
dc.contributor.author Mitchell, B. Greg
dc.contributor.author Kriegman, David
dc.coverage.spatial Eilat
dc.coverage.spatial Israel
dc.coverage.temporal 2015
dc.date.accessioned 2016-03-29T19:13:33Z
dc.date.available 2016-03-29T19:13:33Z
dc.date.issued 2016-03-29
dc.identifier doi:10.5061/dryad.t4362
dc.identifier.citation Beijbom O, Treibitz T, Kline DI, Eyal G, Khen A, Neal B, Loya Y, Mitchell BG, Kriegman D (2016) Improving automated annotation of benthic survey images using wide-band fluorescence. Scientific Reports 6: 23166.
dc.identifier.uri http://hdl.handle.net/10255/dryad.111099
dc.description Large-scale imaging techniques are used increasingly for ecological surveys. However, manual analysis can be prohibitively expensive, creating a bottleneck between collected images and desired data-products. This bottleneck is particularly severe for benthic surveys, where millions of images are obtained each year. Recent automated annotation methods may provide a solution, but reflectance images do not always contain sufficient information for adequate classification accuracy. In this work, the FluorIS, a low-cost modified consumer camera, was used to capture wide-band wide-field-of-view fluorescence images during a field deployment in Eilat, Israel. The fluorescence images were registered with standard reflectance images, and an automated annotation method based on convolutional neural networks was developed. Our results demonstrate a 22% reduction of classification error rate when using both image types compared to only using reflectance images. The improvements were particularly large for the coral reef genera Platygyra, Acropora and Millepora, where classification recall improved by 38%, 33%, and 41%, respectively. We conclude that convolutional neural networks can be used to combine reflectance and fluorescence imagery in order to significantly improve automated annotation accuracy and reduce the manual annotation bottleneck.
dc.relation.ispartofseries 6;;2016
dc.relation.haspart doi:10.5061/dryad.t4362/1
dc.relation.haspart doi:10.5061/dryad.t4362/2
dc.relation.haspart doi:10.5061/dryad.t4362/3
dc.relation.haspart doi:10.5061/dryad.t4362/4
dc.relation.haspart doi:10.5061/dryad.t4362/5
dc.relation.haspart doi:10.5061/dryad.t4362/6
dc.relation.haspart doi:10.5061/dryad.t4362/7
dc.relation.haspart doi:10.5061/dryad.t4362/8
dc.relation.haspart doi:10.5061/dryad.t4362/9
dc.relation.haspart doi:10.5061/dryad.t4362/10
dc.relation.isreferencedby doi:10.1038/srep23166
dc.relation.isreferencedby PMID:27021133
dc.subject Computer vision
dc.subject Machine learning
dc.subject Deep learning
dc.subject Multi-modal imaging
dc.subject Fluorescence
dc.subject Coral reefs
dc.subject Ecological surveys
dc.title Data from: Improving automated annotation of benthic survey images using wide-band fluorescence
dc.type Article
dwc.ScientificName Faviidae
dwc.ScientificName Stylophora
dwc.ScientificName Platygyra
dwc.ScientificName Acropora
dwc.ScientificName Pocillopora
dwc.ScientificName Millepora
dc.contributor.correspondingAuthor Beijbom, Oscar
prism.publicationName Scientific Reports
dryad.dansTransferDate 2018-05-07T18:03:13.307+0000
dryad.dansEditIRI https://easy.dans.knaw.nl/sword2/container/009f80fe-fa5b-4f7d-996d-f2a8bbf7f3d2
dryad.dansTransferFailed 2018-05-02T01:20:39.159+0000
dryad.dansArchiveDate 2018-05-07T19:02:19.022+0000
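The abstract describes registering wide-band fluorescence images with standard reflectance images and feeding both to a convolutional network. A minimal sketch of that multi-modal fusion step, assuming early fusion by channel stacking; the `fuse_modalities` helper is hypothetical and the published pipeline may combine the modalities differently:

```python
import numpy as np

def fuse_modalities(reflectance: np.ndarray, fluorescence: np.ndarray) -> np.ndarray:
    """Stack a registered reflectance image (H, W, 3) and a registered
    fluorescence image (H, W, 3) into one (H, W, 6) array that a
    convolutional network can take as input. Illustrative helper only;
    not the authors' code."""
    if reflectance.shape != fluorescence.shape:
        raise ValueError("images must be registered to the same shape")
    return np.concatenate([reflectance, fluorescence], axis=-1)

# Dummy 64x64 patches standing in for crops around an annotated point.
refl = np.zeros((64, 64, 3), dtype=np.float32)
fluo = np.zeros((64, 64, 3), dtype=np.float32)
patch = fuse_modalities(refl, fluo)
print(patch.shape)  # (64, 64, 6)
```

Dropping the fluorescence channels from this input reduces it to the reflectance-only baseline that the abstract compares against.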

Files in this package

Content in the Dryad Digital Repository is offered "as is." By downloading files, you agree to the Dryad Terms of Service. To the extent possible under law, the authors have waived all copyright and related or neighboring rights to this data (CC0 Open Data).

Title part0
Downloaded 76 times
Description This data package contains all images and point annotations used in the present publication. To access the data, download all parts and merge them using the following command in a Linux shell: cat data_package* > merged_data.zip. Then unzip the archive to access the data.
Download README.txt (2.2 Kb)
Download data_package00 (1.073 Gb)
Title part1
Downloaded 37 times
Download data_package01 (1.073 Gb)
Title part2
Downloaded 44 times
Download data_package02 (1.073 Gb)
Title part3
Downloaded 44 times
Download data_package03 (1.073 Gb)
Title part4
Downloaded 28 times
Download data_package04 (1.073 Gb)
Title part5
Downloaded 30 times
Download data_package05 (1.073 Gb)
Title part6
Downloaded 27 times
Download data_package06 (1.073 Gb)
Title part7
Downloaded 23 times
Download data_package07 (1.073 Gb)
Title part8
Downloaded 22 times
Download data_package08 (1.073 Gb)
Title part9
Downloaded 24 times
Download data_package09 (872.9 Mb)
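The merge step from the part0 description can be exercised end to end. The sketch below substitutes tiny dummy files for the ~1 GB parts, since the real downloads are assumed rather than included; with the actual parts, only the cat and unzip steps are needed:

```shell
# Stand-ins for the downloaded parts (really data_package00 … data_package09).
printf 'half1' > data_package00
printf 'half2' > data_package01

# The shell expands the glob in sorted order, so the parts
# concatenate back into the original archive byte-for-byte.
cat data_package* > merged_data.zip

# With the real parts, finish with:  unzip merged_data.zip
ls -l merged_data.zip
```

Each part except the last is exactly 1.073 GB, which is consistent with a single large zip archive split at a fixed byte boundary.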
