Inferring predator-prey interactions from camera traps: A Bayesian co-abundance modelling approach
Cite this dataset
Amir, Zachary; Sovie, Adia; Luskin, Matthew S. (2022). Inferring predator-prey interactions from camera traps: A Bayesian co-abundance modelling approach [Dataset]. Dryad. https://doi.org/10.5061/dryad.b8gtht7h3
Predator-prey dynamics are a fundamental part of ecology, but directly studying these interactions has proven difficult. The proliferation of camera trapping has enabled the collection of large datasets on wildlife, but researchers face hurdles in inferring interactions from observational data. Recent advances in hierarchical co-abundance models infer species interactions while accounting for both species’ detection probabilities and shared responses to environmental covariates, and propagate uncertainty throughout the entire modelling process. However, current approaches remain unsuitable for interacting species whose natural densities differ by an order of magnitude and whose detection probabilities contrast, as in predator-prey interactions, because these differences introduce zero-inflation and overdispersion in count histories. Here we developed a Bayesian hierarchical N-mixture co-abundance model suitable for inferring predator-prey interactions. We accounted for excessive zeros in count histories by using an informed zero-inflated Poisson distribution in the abundance formula, and for overdispersion by including a random effect per sampling unit and sampling occasion in the detection probability formula. We demonstrate that models with these modifications outperform alternative approaches, improve model goodness-of-fit, and overcome parameter-convergence failures. We highlight the model's utility using 20 camera-trapping datasets from 10 tropical forest landscapes in Southeast Asia, estimating four predator-prey relationships between tigers, clouded leopards, and muntjac and sambar deer. Tigers had a negative effect on muntjac abundance, providing support for top-down regulation, while clouded leopards had a positive effect on muntjac and sambar deer, likely driven by shared responses to unmodelled covariates such as hunting.
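The model structure summarized above can be sketched as follows. This is an illustrative parameterization only — the symbols and covariate choices are ours, not necessarily the authors' exact formulation: a zero-inflated Poisson governs latent abundance, predator abundance enters the prey abundance formula as the interaction term, and a per-unit, per-occasion random effect absorbs overdispersion in detection.

```latex
\begin{aligned}
z_i &\sim \mathrm{Bernoulli}(\psi_i)
  && \text{site-suitability indicator (zero-inflation)}\\
N_i &\sim \mathrm{Poisson}(z_i\,\lambda_i)
  && \text{latent prey abundance at sampling unit } i\\
\log \lambda_i &= \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_i + \gamma\,N_i^{\mathrm{pred}}
  && \gamma \text{ captures the predator--prey interaction}\\
y_{ij} &\sim \mathrm{Binomial}(N_i,\, p_{ij})
  && \text{observed count on occasion } j\\
\mathrm{logit}\,p_{ij} &= \alpha_0 + \varepsilon_{ij}, \qquad
  \varepsilon_{ij} \sim \mathcal{N}(0, \sigma^{2})
  && \text{per-unit, per-occasion random effect}
\end{aligned}
```

Here $\mathbf{x}_i$ stands for environmental covariates and $N_i^{\mathrm{pred}}$ for the (latent) predator abundance; in practice the predator's abundance submodel has the same hierarchical form, so uncertainty in $N_i^{\mathrm{pred}}$ propagates into the estimate of $\gamma$.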
This Bayesian co-abundance modelling approach to quantify predator-prey relationships is widely applicable across species, ecosystems, and sampling approaches, and may be useful in forecasting cascading impacts following widespread predator declines. Taken together, this approach facilitates a nuanced and mechanistic understanding of food-web ecology.
This dataset is a subset of 20 large systematic camera-trapping sessions conducted across 10 landscapes in Southeast Asian primary tropical forests. The new method of analyzing camera trap data to infer predator-prey species interactions is described in detail in the manuscript. The camera trap data have already been converted to count history matrices and spatial covariates have already been generated; both are saved as .csv files. The repository also contains completed co-abundance models, which are saved as .RDS files.
The repository contains a .zip file with all R code, datasets (.csv files), and completed co-abundance models (.RDS files) needed to run your own co-abundance model and generate the same figures included in the manuscript. Individual data files (e.g., count_history_matrix) and completed models (e.g., OG models) are saved in sub-folders within the .zip file. In addition, all individual files within the compressed .zip file have been added to the repository separately to facilitate examination of individual data files. The repository contains a very detailed README file that includes a series of questions to determine whether running a co-abundance model is the right option for you, along with detailed descriptions of each data type. The R script included in the repository is thoroughly commented and explained clearly throughout.
National Geographic Society Committee for Research and Exploration, Award: 9384–13
Australian Research Council, Award: Discovery Early Career Research Award DE210101440