Journal of Information Systems Education

Abstract

As Information Systems courses have become more data-focused and student numbers have increased, there is a greater need to assess technical and analytical skills efficiently and effectively. Multiple-choice examinations provide a means of accomplishing this, though creating effective multiple-choice assessment items in a technical course context can be challenging. This study presents an iterative quality improvement framework, based on the Plan-Do-Study-Act (PDSA) quality assurance cycle, for developing and improving such multiple-choice assessments. Integral to this framework, we also present a rigorous, reliable, and valid measure of assessment and item quality using discrimination efficiency and the KR-20 assessment reliability measure. We demonstrate the effectiveness of our approach across exams developed and administered for two courses: a highly technical introductory Information Systems course and an introductory data analytics course. Using this approach, we show that assessment quality improves iteratively when instructors measure items and exams rigorously and apply this PDSA framework.
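For readers unfamiliar with the KR-20 statistic mentioned above, the following is a minimal sketch of how it can be computed from a dichotomously scored (0/1) item-response matrix. The function name `kr20` and the toy data are illustrative assumptions, not code or data from the study itself.

```python
import numpy as np

def kr20(responses: np.ndarray) -> float:
    """Kuder-Richardson Formula 20 reliability for dichotomous (0/1) items.

    responses: 2-D array of shape (n_examinees, k_items), entries 0 or 1.
    Returns KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / var(total scores)).
    """
    n_examinees, k = responses.shape
    p = responses.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p                                     # proportion incorrect per item
    total_var = responses.sum(axis=1).var(ddof=1)   # sample variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

# Illustrative example: 5 examinees, 4 items
scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
])
print(f"KR-20 = {kr20(scores):.3f}")
```

Note that some treatments use the population variance (ddof=0) of total scores rather than the sample variance; the sketch above assumes the sample variance.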
