Crowdsourced Text Transcription on the Zooniverse Platform: Models for Design, Development and Evaluation

1. Abstract

This paper will discuss the steps taken by the team behind the Zooniverse.org crowdsourcing platform to address the relationship between task design and project outcomes, including a range of approaches to text-based data collection, volunteer engagement, and the evaluation of methods alongside results. By focusing on two 'generations' of online, crowdsourced text transcription projects, the authors will show how the Zooniverse team has used the successes and failures of previous projects as the basis for newly developed methods of data collection. The presentation will focus on two projects launched in 2015 and their results, and will show how the methods used in those projects inspired the creation of an A/B experiment that ran in 2018. The authors will discuss the results of that experiment and how those results have shaped the tool development currently underway on the platform.

Samantha Nicole Blickhan (samantha@zooniverse.org), The Adler Planetarium, United States of America and Victoria Anne Van Hyning (vvanhyning@loc.gov), Library of Congress
