Journal of Information Systems Education


Successful development of an information system to solve a business problem depends on the analyst's ability to elicit system requirements from a user. This complex competency can be trained through critical peer evaluation of requirements elicitation (RE) interviews. In this study, 294 students across four pre-pandemic and two COVID-19 pandemic-affected semesters evaluated recorded sample RE interviews of low and high quality. Piecewise regression modeling was used to examine the change in students' evaluations separately for the pre-pandemic and pandemic-affected semesters. The results showed that students exhibited inflated evaluation scores (relative to instructors' scores) for the high-quality, but not for the low-quality, interview. While students' evaluations of the low-quality interview remained stable across the pre-pandemic semesters, a significant decrease in evaluation scores for the high-quality interview narrowed the gap between students' and instructors' evaluations. The onset of the COVID-19 pandemic brought a significant increase in students' evaluation scores, which decreased during the second pandemic-affected semester. Moreover, female students inflated their evaluations more than male students, specifically for technical rather than soft skills. These findings shed light on several important trends in students' peer evaluations in the context of RE training and on the possible effects of massive learning disruptions such as the pandemic.


