Paper Number

1669

Abstract

Algorithmic forecasts outperform human forecasts in many tasks, and state-of-the-art machine learning (ML) algorithms have widened that gap further. Since sales forecasting plays a key role in business profitability, ML-based sales forecasting can offer significant advantages. However, individuals are resistant to using algorithmic forecasts. Explainable AI (XAI), where an explanation interface (XI) presents model predictions and explanations to the user, can help overcome this algorithm aversion. However, current XAI techniques are incomprehensible to laymen. Despite the economic relevance of sales forecasting, there is no significant research effort towards helping non-expert users make better decisions with ML forecasting systems by designing appropriate XIs. We contribute to this research gap by designing a model-agnostic XI for laymen. We propose a design theory for XIs, instantiate our theory, and report initial formative evaluation results. The evaluation uses a real-world context: a medium-sized Swiss bakery chain provides past sales data and human forecasts.
