Abstract
Effective communication is essential to safe, patient-centred healthcare, yet many simulation-based education tools offer limited opportunities for students to practise verbal interaction with patients. This study addresses that gap by developing a bespoke mixed reality (MR) simulation that enables real-time, AI-driven verbal communication with a virtual patient. Designed specifically for this research and deployed on the Meta Quest 3 platform, the simulation supports the development of communication skills by allowing students to speak directly with a virtual patient and receive dynamic responses powered by large language models and natural language processing. The study introduces a novel theoretical model that integrates Task-Technology Fit and Flow Theory to examine how verbal interaction influences user satisfaction, perceived task alignment and continued use intention. A mixed-methods, quasi-experimental design compares healthcare students' experiences across two MR conditions: one featuring a non-verbal patient and the other incorporating AI-driven verbal interactivity. Participants will complete structured surveys adapted from validated instruments to assess fluency, absorption, enjoyment, task fit and satisfaction. Qualitative data from focus group discussions will support the interpretation of learner experience and system usability. This research contributes to the design of scalable, interactive MR simulations for healthcare education and offers new insights into how AI-driven communication features affect learner experience and technology adoption. It also provides a foundation for further information systems research into aligning conversational AI with structured learning tasks and immersive environments.
Recommended Citation
Clark, David; Sorwar, Golam; and Naumann, Fiona, "AI-Driven Verbal Interaction with Virtual Patients in Mixed Reality for Healthcare Education" (2025). ACIS 2025 Proceedings. 10.
https://aisel.aisnet.org/acis2025/10