Location
Grand Wailea, Hawaii
Event Website
https://hicss.hawaii.edu/
Start Date
January 8, 2019
End Date
January 11, 2019
Description
Objective: We present a Teaching-as-Research project that implements a new intervention in a flipped software engineering course over two semesters. The short-term objective of the intervention was to improve students’ preparedness for live sessions; the long-term objective was to improve their knowledge retention, evaluated in time-separated high-stakes assessments.

Intervention: The intervention added weekly low-stakes just-in-time assessments to course modules to motivate students to review assigned instructional materials in a timely manner. For each course module, the assessments consisted of two preparatory quizzes embedded within off-class instructional materials and one non-embedded in-class quiz.

Method: Embedded assessments were deployed to two subgroups of students in an alternating manner, while in-class assessments were deployed to all students. We measured the impact of the embedded assessments on in-class assessment scores and on final exam performance.

Results: Embedded assessments improved students’ preparedness for live sessions; the effect was statistically significant but variable. Embedded assessments did not affect long-term knowledge retention as assessed on the final exam. We have decided to keep the intervention and deploy it to all students in the future.
Title
Introducing Low-Stakes Just-in-Time Assessments to a Flipped Software Engineering Course
Paper URL
https://aisel.aisnet.org/hicss-52/set/measurement_assessment/4
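
The alternating (crossover) deployment described in the Method section lends itself to a simple statistical check: an independent-samples test for the short-term in-class effect, and a within-student paired test for the final-exam effect, since each student experienced both conditions across modules. The Python sketch below is a minimal illustration of that kind of analysis, not the authors’ actual procedure; the CSV file, the column names, and the choice of Welch’s and paired t-tests are all assumptions.

import pandas as pd
from scipy import stats

# Hypothetical layout: one row per (student, module), with a flag recording
# whether the student received the embedded preparatory quizzes for that
# module. File name and column names are illustrative assumptions.
df = pd.read_csv("scores.csv")
# columns: student_id, module, embedded (0/1), inclass_score, exam_item_score

# Short-term effect: in-class quiz scores on modules where the student had
# embedded assessments vs. modules where they did not.
treated = df.loc[df["embedded"] == 1, "inclass_score"]
control = df.loc[df["embedded"] == 0, "inclass_score"]
t, p = stats.ttest_ind(treated, control, equal_var=False)  # Welch's t-test
print(f"In-class quizzes: t = {t:.2f}, p = {p:.4f}")

# Long-term effect: because each student alternated between conditions, a
# paired comparison is natural: per student, mean final-exam item score on
# modules covered by embedded quizzes vs. modules that were not.
paired = df.pivot_table(index="student_id", columns="embedded",
                        values="exam_item_score", aggfunc="mean")
t, p = stats.ttest_rel(paired[1], paired[0])
print(f"Final exam items: t = {t:.2f}, p = {p:.4f}")

Under these assumptions, the first test corresponds to the short-term preparedness result and the second to the final-exam retention result reported in the abstract.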