Paper Number
2719
Paper Type
Complete
Abstract
Explainable Artificial Intelligence (XAI) can contribute to the idea of AI being an instrument for reflection when used to augment human decision-making. In the educational domain, reflective decision-making is crucial, as decisions have a meaningful and long-term impact. Against this background, we propose an XAI-based approach that supports users in making reflective educational decisions. Our approach introduces three main ideas: concepts as a “shared language” between AI and users, concept-based explanations, and concept-based interventions. We demonstrate the practical applicability of our approach on a real-world dataset of university courses. We evaluate the efficacy of our approach in a user study with 495 participants. Results suggest that our novel approach effectively supports users in making reflective decisions compared to black-box recommender systems, while increasing users’ exploration, self-reflection, confidence, and trust. The effectiveness of our approach is attributable to the combination of concept-based explanations and the opportunity to intervene.
Recommended Citation
Förster, Maximilian; Schröppel, Philipp; Schwenke, Chiara; Fink, Lior; and Klier, Mathias, "Choose Wisely: Leveraging Explainable AI to Support Reflective Decision-Making" (2024). ICIS 2024 Proceedings. 22.
https://aisel.aisnet.org/icis2024/aiinbus/aiinbus/22
Choose Wisely: Leveraging Explainable AI to Support Reflective Decision-Making
Comments
10-AI