PACIS 2021 Proceedings

Paper Type: FP

Paper Number: 472

Abstract

Hybrid Intelligence is an emerging concept that emphasizes the complementary nature of human intelligence and artificial intelligence (AI). One key requirement for collaboration between humans and AI is the interpretability of the decisions provided by the AI, which enables humans to assess whether to comply with those decisions. Because state-of-the-art AI is largely a black box, the explainable AI (XAI) research community has developed various means to increase interpretability. However, many studies show that increased interpretability through XAI does not necessarily result in complementary team performance (CTP). Through a structured literature review, we identify relevant factors that influence collaboration between humans and AI. Synthesizing the findings of the collected research articles, we develop a research agenda with hypotheses that lay the foundation for future research on human-AI complementarity in Hybrid Intelligence systems.
