Data from: A spatiotemporal analysis of acoustic interactions between great reed warblers (Acrocephalus arundinaceus) using microphone arrays and robot audition software HARK
Suzuki, Reiji et al. (2018), Data from: A spatiotemporal analysis of acoustic interactions between great reed warblers (Acrocephalus arundinaceus) using microphone arrays and robot audition software HARK, Dryad, Dataset, https://doi.org/10.5061/dryad.n378d
Acoustic interactions are important for understanding intra- and interspecific communication in songbird communities from the viewpoint of soundscape ecology. It has been suggested that birds may partition sound space to increase communication efficiency, tending to avoid temporal overlap with other singing birds. We are interested in clarifying the dynamics underlying this process as an example of a complex system based on short-term behavioral plasticity. However, it is difficult to manually extract spatiotemporal patterns of acoustic events in natural habitats from a standard single-channel recording when several species sing simultaneously. Our purpose here is to investigate fine-scale spatiotemporal acoustic interactions of the great reed warbler. We surveyed spatial and temporal singing patterns of several color-banded great reed warblers (Acrocephalus arundinaceus) using HARK (Honda Research Institute Japan Audition for Robots with Kyoto University), an open-source software platform for robot audition, together with three new 16-channel, stand-alone, water-resistant microphone arrays, named DACHO, deployed in the birds' habitat. We first show that our system estimated the song-post locations of two color-banded individuals with a mean error of 5.5 ± 4.5 m from the observed song posts. We then evaluated the temporal pattern of localized songs by comparing their durations with those annotated by human observers, obtaining an average accuracy score of 0.89 for one bird that stayed at one song post. Using transfer entropy, we found significant temporal overlap avoidance and an asymmetric relationship between the songs of the two singing individuals. We believe that our system and analytical approach contribute to a better understanding of fine-scale acoustic interactions in time and space in bird communities.
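The asymmetric leader-follower relationship mentioned above rests on transfer entropy: how much one bird's singing history reduces uncertainty about the other bird's next singing state, beyond that bird's own history. The following is a minimal illustrative sketch of a lag-1 plug-in estimator on binary song on/off sequences; the function name, lag, binning, and toy data are assumptions for illustration and do not reproduce the paper's actual estimator or dataset.

```python
# Illustrative sketch (not the paper's code): lag-1 transfer entropy
# between two binary song on/off time series, via plug-in probability
# estimates from counts.
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """TE from x to y in bits: extra predictability of y's next state
    from x's past, beyond y's own past (lag-1 histories)."""
    n = len(y) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_next, y_prev, x_prev)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))        # (y_prev, x_prev)
    pairs_yy = Counter(zip(y[1:], y[:-1]))         # (y_next, y_prev)
    singles_y = Counter(y[:-1])                    # y_prev
    te = 0.0
    for (yn, yp, xp), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yp, xp)]             # p(yn | yp, xp)
        p_cond_self = pairs_yy[(yn, yp)] / singles_y[yp]  # p(yn | yp)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

# Toy asymmetric interaction: y goes silent one step after x sings
# (overlap avoidance), while x sings at random, so information should
# flow from x to y but not the reverse.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 2000)
y = np.empty_like(x)
y[0] = 0
y[1:] = 1 - x[:-1]  # y avoids overlapping x with a one-step delay

print(transfer_entropy(x.tolist(), y.tolist()))  # high (x drives y)
print(transfer_entropy(y.tolist(), x.tolist()))  # near zero
```

An asymmetry like the one in the toy example (TE from x to y much larger than the reverse) is the kind of signature the abstract refers to when it reports an asymmetric relationship between the two individuals' songs.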