Abstract

Identifying competent reviewers who can contribute helpful reviews for online courses is an imperative task for online learning platforms and course instructors. Online learners value reviews of recent course experience; therefore, a proactive approach to soliciting and incentivizing reviews from competent reviewers is needed. This research proposes two performance-based indicators of reviewer competence, i.e., reviewers’ prior helpfulness and specialization, and tests their power to predict whether reviewers contribute helpful online course reviews. A sample of 36,825 reviews and their reviewers on a leading MOOC platform in China was analyzed. Results show that both reviewers’ prior helpfulness and specialization predict the helpfulness of their review contributions. The predictive power of prior helpfulness is susceptible to situational conditions, including learners’/reviewers’ working status and course learning performance, whereas that of reviewer specialization is relatively consistent. This research enhances the understanding of performance-based reviewer competence assessment, particularly in the online learning context. Its results can be applied directly to identify potential reviewers and solicit helpful online course reviews.
