PACIS 2020 Proceedings

Abstract

Crowdsourced reports written by nonprofessional analysts and published on online platforms are enjoying increasing popularity. At the same time, analyst reports from institutional equity research firms are reported to be losing influence. In this study, we aim to explain these two phenomena by comparing the information provision in the texts of institutional and crowdsourced analyst reports. In our comparative analysis, we apply text mining techniques to evaluate and compare (i) the readability and (ii) the information density of more than 25,000 analyst reports. We find that crowdsourced analyst reports are more readable than institutional ones. Furthermore, crowdsourced reports provide more information within the same length of text. With this study, we provide evidence that helps explain the success of crowdsourced analyst reports. Based on these insights, we also give established brokerage houses an indication of how they could improve their reports.
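The abstract does not specify the exact text mining measures used. As a minimal, purely illustrative sketch, the snippet below computes a standard readability score (Flesch Reading Ease, via the textstat package) and a simple information-density proxy (share of non-stopword tokens) for a report text; the metric choices and the stopword list are assumptions, not the authors' actual pipeline.

```python
# Illustrative sketch (not the paper's exact method): compute a readability
# score and a crude information-density proxy for a report text.
# Assumes the `textstat` package is installed (pip install textstat).
import textstat

# Small, hand-picked stopword list used only for this illustration.
STOPWORDS = {
    "the", "a", "an", "and", "or", "but", "of", "to", "in", "on",
    "for", "with", "is", "are", "was", "were", "be", "been", "that",
    "this", "it", "as", "at", "by", "from", "we", "our",
}


def readability(text: str) -> float:
    """Flesch Reading Ease: higher scores indicate easier-to-read text."""
    return textstat.flesch_reading_ease(text)


def information_density(text: str) -> float:
    """Share of content-bearing (non-stopword) tokens among all tokens.

    A rough proxy: texts that pack more content words into the same
    length score higher.
    """
    tokens = [t.strip(".,;:!?()\"'").lower() for t in text.split()]
    tokens = [t for t in tokens if t]
    if not tokens:
        return 0.0
    content = [t for t in tokens if t not in STOPWORDS]
    return len(content) / len(tokens)


if __name__ == "__main__":
    sample = (
        "The company reported strong quarterly earnings. "
        "Revenue grew 12% year over year, driven by cloud services."
    )
    print(f"Readability (Flesch): {readability(sample):.1f}")
    print(f"Information density:  {information_density(sample):.2f}")
```

Applied to both institutional and crowdsourced report corpora, scores like these could then be compared across the two groups, for example with standard difference-in-means tests.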
