Frustration and ennui among Amazon MTurk workers

Date

2023-09

Publisher

Springer Nature on behalf of the Psychonomic Society, Inc

Rights

(c) The author/s
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
CC BY

Abstract

Academics are increasingly turning to crowdsourcing platforms to recruit research participants. Their endeavors have benefited from a proliferation of studies attesting to the quality of crowdsourced data or offering guidance on managing specific challenges associated with doing crowdsourced research. Thus far, however, relatively little is known about what it is like to be a participant in crowdsourced research. Our analysis of almost 1400 free-text responses provides insight into the frustrations encountered by workers on one widely used crowdsourcing site: Amazon's MTurk. Some of these frustrations stem from inherent limitations of the MTurk platform and cannot easily be addressed by researchers. Many others, however, concern factors that are directly controllable by researchers and that may also be relevant for researchers using other crowdsourcing platforms such as Prolific or CrowdFlower. Based on participants' accounts of their experiences as crowdsource workers, we offer recommendations researchers might consider as they seek to design online studies that demonstrate consideration for respondents and respect for their time, effort, and dignity.

Keywords

Crowdsourcing, Digital methods, Ethics, Internet, Job satisfaction, Online research, Participants, Humans, Frustration, Research Personnel

Citation

Fowler C, Jiao J, Pitts M. (2023). Frustration and ennui among Amazon MTurk workers. Behav Res Methods, 55(6), 3009-3025.

Creative Commons license

Except where otherwise noted, this item's license is described as (c) The author/s