Abstract
Explainable artificial intelligence (XAI) plays an important role in building trust in AI-driven decisions across industries; however, the specific requirements for explainability can vary significantly from one sector to another. In this study, we investigated users' requirements for AI explainability across six sectors (logistics, healthcare, education, construction, agriculture, and manufacturing) to develop a cross-sector framework for XAI. We employed a co-creation methodology and conducted a workshop with diverse stakeholders to identify their needs and preferences regarding AI explainability. Thematic analysis of the workshop data revealed that stakeholders universally emphasized transparency, contextual accuracy, personalization, and user control in AI explanations. Our study also highlighted sector-specific differences that shape explainability preferences. The proposed framework balances sector-specific adaptability with the foundational principles of XAI, offering actionable guidelines and insights for both researchers and practitioners.
Recommended Citation
Khan, Arsalan; Wolff, Annika; and Islam, Najmul, "A Cross-Sector Framework For Explainable Artificial Intelligence (XAI) Through Co-Creation" (2025). 16th Scandinavian Conference on Information Systems. 10.
https://aisel.aisnet.org/scis2025/10