Paper Number
ICIS2025-1405
Paper Type
Complete
Abstract
Firms use crowdsourcing contests to seek ideas from the crowd. Because seekers often lack relevant expertise in the focal problem (which is why many of them crowdsource in the first place), their idea evaluations can be affected by solvers’ ability cues. Drawing on the accessibility-diagnosticity framework, we theorize that seekers favor ideas from high-ability solvers, even though the corresponding idea quality is not necessarily higher, and propose approaches to mitigate this inherent bias. In Study A, using field data from a crowdsourcing contest platform, we demonstrate that solvers’ ideas were evaluated more positively immediately after their status was upgraded, indicating that seekers’ idea evaluations are influenced by solver ability cues. In Study B, we conducted an online experiment to test debiasing interventions. Results show that the effects of solver ability cues can be mitigated by reducing their diagnosticity. This research contributes to the crowdsourcing contest literature and offers practical implications for the design of contest platforms.
Recommended Citation
Ye, Kai and Koh, Tat Koon, "Idea Evaluation Bias and Debiasing in Crowdsourcing Contests" (2025). ICIS 2025 Proceedings. 31.
https://aisel.aisnet.org/icis2025/sharing_econ/sharing_econ/31
Idea Evaluation Bias and Debiasing in Crowdsourcing Contests
Track
Sharing Economy