Start Date
August 11, 2016
Description
Idea evaluation in innovation processes is typically performed by experts. However, because experts are often hard to find and costly, the relatively new phenomenon of crowdvoting may offer an attractive alternative. In this paper, we investigate whether the evaluation of a large number of ideas by a crowd can match that of experts and whether, ultimately, crowdvoting has the potential to substitute for experts. Eighty business model ideas generated in a classroom experiment are used to compare evaluations by experts with those of a crowd recruited via a real-life online crowdvoting platform. Applying a separate and absolute assessment scale for each idea, our differentiated analysis indicates that an anonymous online crowd cannot evaluate business models to the same standard as experts. Our study contributes new theoretical insights to research on crowd evaluations and highlights their practical potential as well as their limitations.
Recommended Citation
Goerzen, Thomas and Kundisch, Dennis, "Can the Crowd Substitute Experts in Evaluation of Creative Ideas? An Experimental Study Using Business Models" (2016). AMCIS 2016 Proceedings. 10.
https://aisel.aisnet.org/amcis2016/Virtual/Presentations/10