U.S. Department of Agriculture: Animal and Plant Health Inspection Service



Benjamin F. Martini http://orcid.org/0000-0002-8874-3997


Martini, B.F., and D.A. Miller. 2021. Using object-based image analysis to detect laughing gull nests. GIScience & Remote Sensing 58(8):1497-1517.

doi: 10.1080/15481603.2021.1999376


U.S. government work


Remote sensing has long been used to study wildlife; however, manual methods of detecting wildlife in aerial imagery are time-consuming and prone to human error, and newer computer vision techniques have not yet been widely applied to wildlife surveys. We used the object-based image analysis (OBIA) software eCognition to detect laughing gull (Leucophaeus atricilla) nests in Jamaica Bay as part of an ongoing monitoring effort at John F. Kennedy International Airport. Our technique combines high-resolution 4-band aerial imagery, captured from a manned aircraft with a multispectral UltraCam Falcon M2 camera; LiDAR point cloud data; and land cover data derived from a bathymetric LiDAR point cloud to classify and extract laughing gull nests. Our ruleset draws on the site (topographic position of nest objects), tone (spectral characteristics of nest objects), shape, size, and association (nearby objects commonly found with the objects of interest that help identify them) elements of image interpretation, as well as NDVI and a sublevel object examination, to classify and extract nests. The ruleset achieves a producer’s accuracy of 98%, a user’s accuracy of 65%, and a kappa of 0.696, indicating that it extracts the majority of nests in the imagery while limiting errors of commission to 35% of the final results. The remaining errors of commission are difficult for the software to differentiate without also reducing the number of nests successfully extracted; they are best addressed by manual verification of the output as part of a semi-automated workflow, in which the OBIA completes the initial search of the imagery and the user then systematically verifies the results to remove errors. This eliminates the need to manually search entire sets of imagery for nests, yielding a more efficient and less error-prone methodology than previous unassisted image interpretation techniques.
Because of the extensibility of OBIA software and the increasing availability of imagery due to small unmanned aircraft systems (sUAS), our methodology and its benefits have great potential for adaptation to other species surveyed using aerial imagery to enhance wildlife population monitoring.
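The accuracy measures reported in the abstract (producer’s accuracy, user’s accuracy, and Cohen’s kappa) are all derived from a classification confusion matrix. As a minimal sketch of how such metrics are computed for a binary nest/non-nest classification, the function below uses standard definitions; the confusion-matrix counts in the example call are hypothetical and are not the study’s data.

```python
# Sketch of standard accuracy-assessment metrics for a binary
# (nest vs. non-nest) classification. Counts are hypothetical,
# not the values from the Jamaica Bay study.

def accuracy_metrics(tp, fp, fn, tn):
    """Producer's accuracy, user's accuracy, and Cohen's kappa
    for the positive ('nest') class of a 2x2 confusion matrix.

    tp: nests correctly extracted
    fp: non-nests extracted as nests (errors of commission)
    fn: nests missed (errors of omission)
    tn: non-nests correctly rejected
    """
    total = tp + fp + fn + tn
    producers = tp / (tp + fn)   # 1 - omission error rate
    users = tp / (tp + fp)       # 1 - commission error rate
    po = (tp + tn) / total       # observed agreement
    # chance agreement from row/column marginals
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / total**2
    kappa = (po - pe) / (1 - pe)
    return producers, users, kappa

# Hypothetical example: 98 of 100 reference nests extracted,
# with 52 false detections among 150 extracted objects.
p, u, k = accuracy_metrics(tp=98, fp=52, fn=2, tn=148)
print(f"producer's {p:.2f}, user's {u:.2f}, kappa {k:.3f}")
```

With these made-up counts, producer’s accuracy is 0.98 and user’s accuracy is about 0.65, mirroring the structure (though not the exact kappa) of the results reported above: a high proportion of true nests recovered, at the cost of commission errors that a manual verification pass then removes.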