This study replicates one of the two studies reported in “Beyond the Turk: Alternative platforms for crowdsourcing behavioral research” (Peer, Brandimarte, Samat, & Acquisti, 2017). We conduct an empirical comparison of two online crowdsourcing platforms, Amazon Mechanical Turk (MTurk) and Prolific Academic (ProA), as well as a traditional student group. Online crowdsourcing platforms such as MTurk have been used for years as a launching point for many types of microwork, including academic research. Today, MTurk has several competitors, including ProA, which was built to focus on research tasks. Across the four segments, we reinforce the original study’s findings: both MTurk and ProA provide inexpensive, reliable, and significantly faster methods of conducting surveys than traditional methods. Our results further indicate that ProA, with its focus on research, provides superior service: its results are similar to MTurk’s, but its response and completion rates, participant diversity, attention, naivety, and reproducibility are better, and dishonest behavior is lower.
Adams, Troy L.; Li, Yuanxia; and Liu, Hao, “A Replication of Beyond the Turk: Alternative Platforms for Crowdsourcing Behavioral Research – Sometimes Preferable to Student Groups,” AIS Transactions on Replication Research: Vol. 6, Article 15.
Available at: https://aisel.aisnet.org/trr/vol6/iss1/15