Business & Information Systems Engineering
Document Type
Research Paper
Abstract
The most promising standard machine learning methods can deliver highly accurate classification results, often outperforming standard white-box methods. However, it is hardly possible for humans to fully understand the rationale behind the black-box results, and thus, these powerful methods hamper the creation of new knowledge on the part of humans and the broader acceptance of this technology. Explainable Artificial Intelligence attempts to overcome this problem by making the results more interpretable, while Interactive Machine Learning integrates humans into the process of insight discovery. The paper builds on recent successes in combining these two cutting-edge technologies and proposes how Explanatory Interactive Machine Learning (XIL) is embedded in a generalizable Action Design Research (ADR) process – called XIL-ADR. This approach can be used to analyze data, inspect models, and iteratively improve them. The paper shows the application of this process using the diagnosis of viral pneumonia, e.g., Covid-19, as an illustrative example. By these means, the paper also illustrates how XIL-ADR can help identify shortcomings of standard machine learning projects, gain new insights on the part of the human user, and thereby unlock the full potential of AI-based systems for organizations and research.
Recommended Citation
Pfeuffer, Nicolas; Baum, Lorenz; Stammer, Wolfgang; Abdel-Karim, Benjamin M.; Schramowski, Patrick; Bucher, Andreas M.; Hügel, Christian; Rohde, Gernot; Kersting, Kristian; and Hinz, Oliver (2023) "Explanatory Interactive Machine Learning - Establishing an Action Design Research Process for Machine Learning Projects," Business & Information Systems Engineering: Vol. 65: Iss. 6, 677-701.
Available at: https://aisel.aisnet.org/bise/vol65/iss6/4