Journal of Information Technology
Document Type
Other
Abstract
This study advances the field of Computationally Intensive Theory Development (CTD) by examining the capabilities of Explainable Artificial Intelligence (XAI), in particular SHapley Additive exPlanations (SHAP), for theory development, and by providing guidelines for this process. We evaluate SHAP's methodological capabilities and develop a structured approach for using SHAP to harness insights from black-box predictive models. For this purpose, we adopt a dual-methodological approach. First, to assess SHAP's capability to uncover patterns that shape a phenomenon, we conduct a Monte Carlo simulation study. Second, to illustrate and guide the theory development process with SHAP for CTD, we apply SHAP in a use case with real-world data. Based on these analyses, we propose a stepwise, uniform, and replicable approach that guides rigorous theory development and increases the traceability of the theorizing process. With our structured approach, we contribute to the use of XAI approaches in research and, by uncovering patterns in black-box prediction models, add to the ongoing search for next-generation theorizing methods in the field of Information Systems (IS).
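To make the abstract's core idea concrete, the following is a minimal sketch (not the authors' actual study design) of the kind of workflow described: simulate data with a known pattern, fit a black-box model, and check whether SHAP attributions recover that pattern. The data-generating process, variable names, and choice of model are illustrative assumptions, not details taken from the paper.

# Illustrative sketch only; design choices here are assumptions, not the paper's method.
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)

# Simulated "ground truth": x1 has a known nonlinear effect, x2 a linear one,
# x3 is pure noise. A Monte Carlo study would vary such patterns across runs.
n = 1000
X = rng.normal(size=(n, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=n)

# Black-box predictive model fitted to the simulated data.
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# SHAP attributions; TreeExplainer supports tree ensembles like random forests.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# If SHAP recovers the data-generating pattern, x1's attributions should trace
# sin(x1), x2's a straight line, and x3's should stay near zero.
for j, name in enumerate(["x1 (nonlinear)", "x2 (linear)", "x3 (noise)"]):
    print(f"{name}: mean |SHAP| = {np.abs(shap_values[:, j]).mean():.3f}")

In this sketch, comparing the per-feature attributions against the known simulated effects stands in for the paper's simulation-based assessment of whether SHAP can surface the patterns that shape a phenomenon.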
DOI
10.1177/02683962241289597
Recommended Citation
Stoffels, Dominik; Faltermaier, Stefan; Strunk, Kim Simon; and Fiedler, Marina (2025) "Guiding computationally intensive theory development with explainable artificial intelligence: The case of Shapley additive explanations," Journal of Information Technology: Vol. 40: Iss. 2, Article 5.
DOI: 10.1177/02683962241289597
Available at: https://aisel.aisnet.org/jit/vol40/iss2/5