Community-Driven Crowdsourcing: Data Collection with Local Developers

Christina Funk
Ravindran Rajakumar
Linne Ha
Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018), European Language Resources Association (ELRA), Miyazaki, Japan, pp. 1606-1609

Abstract

We tested the viability of partnering with local developers to create custom annotation applications and to recruit and motivate crowd contributors from their communities to perform an annotation task: assigning toxicity ratings to Wikipedia comments. We discuss the background of the project, the design of the community-driven approach, the developers’ execution of their applications and crowdsourcing programs, and the quantity, quality, and cost of the resulting judgments, as well as the influence of each application’s design on the outcomes. The community-driven approach resulted in local developers successfully creating four unique tools and collecting labeled data of sufficiently high quantity and quality. The developers’ creative approaches to task presentation and crowdsourcing program design drew on their local knowledge of their own social networks, whose members also reported interest in the underlying problem that the data collection addresses. We consider the lessons that may be drawn from this project for future iterations of the community-driven approach.