Location
Online
Event Website
https://hicss.hawaii.edu/
Start Date
4-1-2021 12:00 AM
End Date
9-1-2021 12:00 AM
Description
Crowdsourcing contests provide an effective way to elicit novel ideas and creative solutions from collective intelligence. A key design feature of crowdsourcing contests is competition among participants to complete a specific task, with financial awards for the winner(s). In recent years, some crowdsourcing contest platforms have given participants the option to share solutions during the competition. This study evaluates the influence of exposure to shared solutions on different stakeholders, including teams and requesters. We employ multi-level panel data from a large online crowdsourcing platform, Kaggle.com, to examine these effects. For teams, exposure to shared solutions helps new entrant teams jump-start and helps teams achieve better performance in subsequent submissions, and a team's skill level negatively moderates these positive effects. For requesters, allowing solution sharing has both benefits and costs in terms of improving the best performance of the crowd. We highlight the theoretical implications of the study and offer practical suggestions to help crowdsourcing contest platforms decide whether to allow solution sharing during the competition.
Does Exposure to Shared Solutions Lead to Better Outcomes? An Empirical Investigation in Online Crowdsourcing Contests
https://aisel.aisnet.org/hicss-54/os/sites/7