Paper Number
ICIS2025-1443
Paper Type
Short
Abstract
Despite low compensation, workers on microtask crowdsourcing platforms such as Amazon Mechanical Turk (MTurk) continue to participate, and their earnings vary significantly across individuals. However, how microworkers perceive the fairness of their pay remains unclear. This study integrates inductive discovery with deductive validation to examine the factors that influence worker perception of pay fairness. By collecting and analyzing 14,553 MTurk worker reviews with topic modeling, we identify seven key factors: generous pay, bonus opportunity, easy earnings, clear task instruction, time efficiency, requester responsiveness, and well-structured task. Aligning these factors with distributive, procedural, and interactional justice, as informed by organizational justice theory, introduces a novel approach to measuring justice and enables future research to empirically test their impact on worker perception of pay fairness in crowdsourcing contexts. This research also offers practical implications for platform designers seeking to improve worker satisfaction, while providing policymakers with insights into protective regulations for online labor markets.
Recommended Citation
Liu, Yiduo and Jiang, Ling, "Understanding Worker Perception of Pay Fairness on Microtask Crowdsourcing Platforms" (2025). ICIS 2025 Proceedings. 7.
https://aisel.aisnet.org/icis2025/is_transformwork/is_transformwork/7
Understanding Worker Perception of Pay Fairness on Microtask Crowdsourcing Platforms
Comments
03-Transformation