Please use this identifier to cite or link to this item: http://hdl.handle.net/1893/32410
Appears in Collections: Biological and Environmental Sciences Journal Articles
Peer Review Status: Refereed
Title: Robust ecological analysis of camera trap data labelled by a machine learning model
Author(s): Whytock, Robin C
Świeżewski, Jędrzej
Zwerts, Joeri A
Bara‐Słupski, Tadeusz
Koumba Pambo, Aurélie Flore
Rogala, Marek
Bahaa‐el‐din, Laila
Boekee, Kelly
Brittain, Stephanie
Cardoso, Anabelle W
Henschel, Philipp
Lehmann, David
Momboua, Brice
Orbell, Christopher
Abernethy, Katharine A
Keywords: artificial intelligence
biodiversity
birds
Central Africa
mammals
Issue Date: Jun-2021
Date Deposited: 11-Mar-2021
Citation: Whytock RC, Świeżewski J, Zwerts JA, Bara‐Słupski T, Koumba Pambo AF, Rogala M, Bahaa‐el‐din L, Boekee K, Brittain S, Cardoso AW, Henschel P, Lehmann D, Momboua B, Orbell C & Abernethy KA (2021) Robust ecological analysis of camera trap data labelled by a machine learning model. Methods in Ecology and Evolution, 12 (6), pp. 1080-1092. https://doi.org/10.1111/2041-210x.13576
Abstract: 1. Ecological data are collected over vast geographic areas using digital sensors such as camera traps and bioacoustic recorders. Camera traps have become the standard method for surveying many terrestrial mammals and birds, but camera trap arrays often generate millions of images that are time‐consuming to label. This causes significant latency between data collection and subsequent inference, which impedes conservation at a time of ecological crisis. Machine learning algorithms have been developed to improve the speed of labelling camera trap data, but it is uncertain how the outputs of these models can be used in ecological analyses without secondary validation by a human.
2. Here, we present our approach to developing, testing and applying a machine learning model to camera trap data for the purpose of achieving fully automated ecological analyses. As a case‐study, we built a model to classify 26 Central African forest mammal and bird species (or groups). The model generalizes to new spatially and temporally independent data (n = 227 camera stations, n = 23,868 images), and outperforms humans in several respects (e.g. detecting ‘invisible’ animals). We demonstrate how ecologists can evaluate a machine learning model's precision and accuracy in an ecological context by comparing species richness, activity patterns (n = 4 species tested) and occupancy (n = 4 species tested) derived from machine learning labels with the same estimates derived from expert labels.
3. Results show that fully automated species labels can be equivalent to expert labels when calculating species richness, activity patterns (n = 4 species tested) and estimating occupancy (n = 3 of 4 species tested) in a large, completely out‐of‐sample test dataset. Simple thresholding using the Softmax values (i.e. excluding ‘uncertain’ labels) improved the model's performance when calculating activity patterns and estimating occupancy but did not improve estimates of species richness.
4. We conclude that, with adequate testing and evaluation in an ecological context, a machine learning model can generate labels for direct use in ecological analyses without the need for manual validation. We provide the user‐community with a multi‐platform, multi‐language graphical user interface that can be used to run our model offline.
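The Softmax thresholding described in point 3 of the abstract can be illustrated with a minimal sketch: model labels whose top Softmax score falls below a cut-off are treated as ‘uncertain’ and excluded before ecological analysis. The threshold value, column names and data layout below are illustrative assumptions, not the authors' pipeline.

# Minimal sketch of Softmax-threshold filtering of machine learning labels.
# The threshold (0.7), column names and example data are assumptions for
# illustration only; they are not taken from the published model or GUI.
import pandas as pd

def filter_confident_labels(predictions: pd.DataFrame, threshold: float = 0.7) -> pd.DataFrame:
    """Keep only images whose top-1 Softmax score meets the threshold.

    `predictions` is assumed to hold one row per image with columns
    'image_id', 'predicted_species' and 'softmax_score'.
    """
    confident = predictions[predictions["softmax_score"] >= threshold]
    return confident.reset_index(drop=True)

if __name__ == "__main__":
    # Hypothetical camera trap labels produced by a classifier.
    preds = pd.DataFrame({
        "image_id": ["img_001.jpg", "img_002.jpg", "img_003.jpg"],
        "predicted_species": ["Leopard", "African grey parrot", "Blank"],
        "softmax_score": [0.95, 0.55, 0.88],
    })
    kept = filter_confident_labels(preds, threshold=0.7)
    # Only the retained labels would feed into species richness, activity
    # pattern or occupancy analyses; 'uncertain' labels are excluded.
    print(kept)

In this sketch the low-confidence second image is dropped, mirroring the idea that excluding uncertain labels improved activity pattern and occupancy estimates but is not required for species richness.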
DOI Link: 10.1111/2041-210x.13576
Rights: © 2021 The Authors. Methods in Ecology and Evolution published by John Wiley & Sons Ltd on behalf of the British Ecological Society. This is an open access article under the terms of the Creative Commons Attribution‐NonCommercial License (https://creativecommons.org/licenses/by-nc/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited and is not used for commercial purposes.
Notes: Additional co-authors: Cisquet Kiebou Opepa, Ross T. Pitman, Hugh S. Robinson
Licence URL(s): http://creativecommons.org/licenses/by-nc/4.0/

Files in This Item:
File: 2041-210X.13576.pdf
Description: Fulltext - Published Version
Size: 2.51 MB
Format: Adobe PDF

This item is protected by original copyright

A file in this item is licensed under a Creative Commons Licence.

Items in the Repository are protected by copyright, with all rights reserved, unless otherwise indicated.

The metadata of the records in the Repository are available under the CC0 public domain dedication (No Rights Reserved): https://creativecommons.org/publicdomain/zero/1.0/

If you believe that any material held in STORRE infringes copyright, please contact library@stir.ac.uk providing details and we will remove the Work from public display in STORRE and investigate your claim.