Abstract
Biosensors are invaluable tools for understanding the subconscious processes that influence human behavior and decision-making. For example, eye trackers can reveal where a person's attention is focused, GSR devices can detect changes in skin conductance that indicate emotional arousal, and facial expression analysis can capture subtle emotional expressions. There is a growing need for business schools to prepare undergraduate students to understand and use biometric data effectively, equipping them to navigate the complexities of this new data type and to learn how to predict and explain human behavior. This TREO presents a use case of a newly created undergraduate course at the Quinlan School of Business at Loyola University Chicago that introduces students to the intersection of UX design and biometric data analysis. The course was designed and implemented by an Information Systems faculty member with experience in HCI research who is a founder and principal investigator of the UX & Biometrics Lab. The course was offered to business honors students as an elective (classified as 'engaged learning') focused on research and data analysis (BH343 Analytical Decision Making). Nineteen students enrolled (eight juniors and eleven seniors), with limited knowledge of experimental research and no knowledge of biometrics. The primary goal of this course was to equip students with the skills necessary to (i) articulate the value of UX in design, (ii) articulate the significance of human insights derived from biometric data, (iii) critically evaluate biometric research studies, (iv) design and execute a biometric data-focused research study, and in the process, (v) analyze and interpret human insights gleaned from fundamental biometric data.
To the best of our knowledge, no other undergraduate business course is fully focused on biometric data with a significant emphasis on hands-on, experiential learning. In the span of a semester (roughly 100 days), students were introduced to the basics of experimental research, types of physiological sensors, and interpretation of the resulting data; presented eighteen academic articles; and were trained in the basics of online biometric data collection and study design (each student completed two studies using the iMotions Online platform). In parallel, students formed seven research teams that completed proposals for seven studies, obtained IRB approval for human subject research, designed their studies, and collected data themselves using the Quinlan UX & Biometrics Lab's iMotions human behavior platform, research-grade eye tracker, GSR device, and/or AFFDEX automated facial expression coding. Each team collected data from 30 subjects per study on average and documented the project by writing a full research manuscript (30-40 pages, including an abstract, introduction, literature review, data analysis, discussion, and conclusions). Students are currently formatting and streamlining their manuscripts for submission to leading academic journals and conferences. This course provided students with a competitive advantage as they enter the workforce or pursue advanced studies. By mastering these skills, students will differentiate themselves in the job market, particularly in industries where such skills are rare. Additionally, for those considering further education at the master's or PhD level (academia), this course provides a solid foundation and a distinctive edge over other candidates, setting them apart as forward-thinkers and innovators in their field.
Paper Number
tpp1257
Recommended Citation
Bačić, Dinko, "Infusing Biosensors and Biometric Data into Business School Undergraduate Curriculum: Quinlan School of Business Case Study" (2024). AMCIS 2024 TREOs. 171.
https://aisel.aisnet.org/treos_amcis2024/171