Mapping Life – Quality Assessment of Novice vs. Expert Georeferencers

May 20, 2016

This week, the field of citizen science got another notch in its belt with the publication of the first issue of a new journal, Citizen Science: Theory and Practice. In the introduction, The Theory and Practice of Citizen Science: Launching a New Journal, Rick Bonney, Caren Cooper, and Heidi Ballard lay out their plans for the journal and how it fits into the larger field of citizen science. As Bonney et al. point out, the international community of citizen science practitioners has acknowledged that citizen science is here to stay with the establishment of the Citizen Science Association (CSA; citizenscience.org), the European Citizen Science Association (ECSA; ecsa.citizen-science.net), the Australian Citizen Science Association (ACSA; citizenscience.org.au), and now this journal.


I'm thrilled to announce that our article, Mapping Life – Quality Assessment of Novice vs. Expert Georeferencers, has been published in the first issue of the journal. In our article, we describe an experiment we conducted with citizen scientists georeferencing textual specimen localities using the online platform GEOLocate. The experiment was multi-faceted and included comparisons of accuracy between citizen scientists and experts, between georeferencing results for different types of specimens (fish and plants), and among several methods for deriving consensus from citizen science data. Read the abstract below, and the full article here.


Abstract

The majority of the world’s billions of biodiversity specimens are tucked away in museum cabinets with only minimal, if any, digital records of the information they contain. Global efforts to digitize specimens are underway, yet the scale of the task is daunting. Fortunately, many activities associated with digitization do not require extensive training and could benefit from the involvement of citizen science participants. However, the quality of the data generated in this way is not well understood. With two experiments presented here, we examine the efficacy of citizen science participants in georeferencing specimen collection localities. In the absence of an online citizen science georeferencing platform and community, students served as a proxy for the larger citizen science population. At Tulane University and Florida State University, undergraduate students and experts used the GEOLocate platform to georeference fish and plant specimen localities, respectively. Our results provide a first-approximation of what can be expected from citizen science participants with minimal georeferencing training as a benchmark for future innovations. After outliers were removed, the range between student and expert georeferenced points was <1.0 to ca. 40.0 km for both the fish and the plant experiments, with an overall mean of 8.3 km and 4.4 km, respectively. Engaging students in the process improved results beyond GEOLocate’s algorithm alone. Calculation of a median point from replicate points improved results further, as did recognition of good georeferencers (e.g., creation of median points contributed by the best 50% of contributors). We provide recommendations for improving accuracy further. We call for the creation of an online citizen science georeferencing platform.
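The consensus step described in the abstract, taking a median point across replicate georeferences, can be sketched simply. The example below is illustrative only, not the paper's actual analysis code: it uses hypothetical replicate coordinates and computes a component-wise median of latitude and longitude, then measures the great-circle distance from that consensus point to a hypothetical expert point.

```python
from math import radians, sin, cos, asin, sqrt
from statistics import median

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

def consensus_point(points):
    """Component-wise median of replicate (lat, lon) georeferences."""
    lats, lons = zip(*points)
    return median(lats), median(lons)

# Five hypothetical student replicates for one locality;
# the last point is a deliberate outlier the median absorbs.
replicates = [(30.01, -90.02), (30.03, -90.00), (29.98, -90.05),
              (30.00, -90.01), (30.40, -90.50)]
expert = (30.00, -90.01)  # hypothetical expert georeference

lat_c, lon_c = consensus_point(replicates)
print(round(haversine_km(lat_c, lon_c, *expert), 2))
```

Because the median is robust to outliers, the consensus point stays near the cluster of careful replicates even when one contributor is far off, which is the intuition behind the paper's finding that median points improved accuracy over individual georeferences.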


Ellwood, E R, Bart, H L, Jr, Doosey, M H, Jue, D K, Mann, J G, Nelson, G, Rios, N and Mast, A R 2016 Mapping Life – Quality Assessment of Novice vs. Expert Georeferencers. Citizen Science: Theory and Practice, 1(1): 4, pp. 1–12, DOI: http://dx.doi.org/10.5334/cstp.30


oecologia ex machina

© 2027. All content by Libby Ellwood, unless stated otherwise.
