When in Doubt Follow the Crowd: How Idea Quality Moderates the Effect of an Anchor on Idea Evaluation
Start Date
10-12-2017 12:00 AM
Description
Companies increasingly engage the crowd in the evaluation of large pools of ideas to sift out the better ones. The crowd, however, appears to be better at eliminating the worst ideas than at identifying the best ones. Using the anchoring effect as a treatment, together with its decreasing effect on the variance of ratings, we develop an approach that enables crowd evaluation to be used for identifying high-quality ideas. To investigate whether our approach is effective, we conduct several experiments on a crowdworking platform. Our results indicate that evaluating ideas of high quality is a more challenging task for the crowd than evaluating those of low quality. Accordingly, idea quality moderates the effect of an anchor on idea evaluation. Our findings both extend the existing literature on crowd evaluation and offer practical solutions for using a crowd to identify the most promising ideas.
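To illustrate the mechanism the abstract refers to, the following minimal Python sketch simulates how showing raters an anchor could pull individual ratings toward a common reference value and thereby shrink the variance of crowd ratings. This is a hypothetical illustration, not the authors' experimental design; the crowd_ratings function, the anchor_weight parameter, and the 7-point scale are illustrative assumptions.

    # Hypothetical simulation of an anchoring treatment on crowd ratings.
    import numpy as np

    rng = np.random.default_rng(0)

    def crowd_ratings(true_quality, anchor=None, n_raters=100,
                      noise_sd=2.0, anchor_weight=0.4):
        # Each rater perceives the idea's quality with random noise.
        perceived = true_quality + rng.normal(0.0, noise_sd, n_raters)
        # Assumed anchoring effect: ratings shift partway toward the anchor,
        # which mechanically reduces their spread.
        if anchor is not None:
            perceived = (1 - anchor_weight) * perceived + anchor_weight * anchor
        return np.clip(perceived, 1, 7)  # assume a 7-point rating scale

    baseline = crowd_ratings(true_quality=5.5)            # no anchor shown
    anchored = crowd_ratings(true_quality=5.5, anchor=5)  # anchor shown
    print(f"variance without anchor: {baseline.var():.2f}")
    print(f"variance with anchor:    {anchored.var():.2f}")  # typically smaller

Because the anchored rating is a convex combination of the noisy perception and a fixed anchor, its variance shrinks by a factor of (1 - anchor_weight) squared, which is the variance-reducing effect the abstract exploits.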
Recommended Citation
Goerzen, Thomas and Kundisch, Dennis, "When in Doubt Follow the Crowd: How Idea Quality Moderates the Effect of an Anchor on Idea Evaluation" (2017). ICIS 2017 Proceedings. 7.
https://aisel.aisnet.org/icis2017/Peer-to-Peer/Presentations/7