Computing Reviews
How many crowdsourced workers should a requester hire?
Carvalho A., Dimitrov S., Larson K. Annals of Mathematics and Artificial Intelligence 78(1): 45-72, 2016. Type: Article
Date Reviewed: Jun 22, 2017

How many crowdsourced workers do you optimally need to ensure acceptable quality in the (averaged) results of outsourced tasks? The surprising answer is ten to eleven, as this paper demonstrates in a lucid and instructive way for several tasks involving human judgment.

When requesters distribute small tasks to crowd workers for very small fees, typically around $0.05, they usually assign the same task to many workers. This redundancy, together with a final averaging over all workers' reported results, is meant to compensate for the workers' unknown levels of quality and expertise.

The authors conducted three different experiments with 50 crowd workers from Amazon Mechanical Turk: two classification tasks (assessing a poem in terms of grammatical correctness and relevance) and a prediction task (predicting the outcome of two National Basketball Association (NBA) games), under different incentive (that is, payment) schemes. Based on an elaborate random sampling and mixing of the results of these 50 workers, and a parameter-free estimation, they convincingly demonstrate that beyond ten to eleven workers, the residual error versus the gold standard (here, an assessment of the poems by university professors and the actual outcomes of the games) no longer shows any noticeable improvement.
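To make that sampling procedure concrete, the following minimal sketch (not the authors' code) shows how one can estimate the residual error as a function of the number of workers by repeatedly drawing random subsets of worker reports, averaging them, and comparing the average to a gold standard. The gold-standard value, the worker reports, and the helper residual_error are synthetic assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins (assumptions, not the paper's data):
# 50 workers each report a noisy, slightly biased judgment of a task
# whose gold-standard value is 0.7 (e.g., a normalized quality score).
gold_standard = 0.7
worker_reports = np.clip(gold_standard + 0.05 + rng.normal(0, 0.2, size=50), 0, 1)

def residual_error(reports, gold, k, n_samples=2000):
    """Mean absolute error of the average of k randomly drawn reports."""
    errors = []
    for _ in range(n_samples):
        subset = rng.choice(reports, size=k, replace=False)
        errors.append(abs(subset.mean() - gold))
    return float(np.mean(errors))

for k in [1, 2, 5, 10, 11, 20, 50]:
    err = residual_error(worker_reports, gold_standard, k)
    print(f"{k:2d} workers -> mean residual error {err:.3f}")
```

Under these synthetic assumptions, the error curve typically flattens after roughly ten reports and, because of the small constant bias built into the simulated workers, it does not shrink to zero, which anticipates the corollary noted below.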

As a corollary, they note a potential limitation of crowdsourcing: even when using many crowd workers and averaging over their outputs, the residual error does not seem to converge to the true value. The paper also abounds with additional interesting results on how to optimize crowdsourcing under certain conditions (for example, if you happen to have some indication of the quality of the crowd workers, then as few as four workers can already yield almost perfect results).
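To illustrate the general idea behind that last observation, here is a hypothetical sketch of a quality-weighted average; the quality scores and the noise model are assumptions for illustration and are not the authors' specific scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
gold = 0.7  # assumed true value, for illustration only

# Hypothetical per-worker quality scores (higher = more reliable);
# better workers are modeled as less noisy reporters.
quality = np.array([0.9, 0.8, 0.7, 0.6])
reports = gold + rng.normal(0, 0.3 * (1 - quality))

plain_mean = reports.mean()
weighted_mean = np.average(reports, weights=quality)
print(f"unweighted: {plain_mean:.3f}, quality-weighted: {weighted_mean:.3f}, gold: {gold}")
```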

Everything is arranged in an extraordinarily clear and readable style (including a very good introduction to the topic and lots of references), with many accompanying (and highly informative) graphs and all technical details in an appendix. Therefore, I unconditionally recommend this paper to everyone in the field of task assignment and worker management.

Reviewer: Christoph F. Strnadl | Review #: CR145365 (1711-0759)
The Computer Industry (K.1)
Codes Of Good Practice (K.7.m ...)
Employment (K.4.3 ...)
