Abstract
As Large Language Models (LLMs) become part of everyday conversation, it is important for everyone to understand the basic concepts of how language models work, so that they are not just black boxes or lists of buzzwords. While a growing number of teaching resources are available for improving AI literacy, a spreadsheet-based approach to help business students understand the fundamental math and logic of LLMs is still needed, particularly because these students are already comfortable with spreadsheets. We introduce Attention in Excel, a series of spreadsheet-based exercises designed to help business students grasp the essence of the self-attention mechanism, a core element of the Transformer model.
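The self-attention computation that such exercises step through cell by cell can also be sketched in a few lines of code. The following is a minimal illustration of standard scaled dot-product attention, not the paper's actual spreadsheets; the query, key, and value matrices are made-up numbers for a hypothetical two-token, two-dimensional example:

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability, then normalize the exponentials.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V,
    # written with plain lists to mirror cell-level spreadsheet arithmetic.
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)  # attention weights sum to 1 per query
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Hypothetical two-token example (illustrative numbers only).
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(self_attention(Q, K, V))  # each row is an attention-weighted mix of the rows of V
```

Each output row is a convex combination of the value vectors, which is exactly the pattern a learner can verify by summing weighted cells in a spreadsheet.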
Recommended Citation
Chung, Tingting Rachel, "Attention in Excel" (2025). Proceedings of the 2025 Pre-ICIS SIGDSA Symposium. 17.
https://aisel.aisnet.org/sigdsa2025/17