Paper Number
ICIS2025-2070
Paper Type
Complete
Abstract
Large-scale pre-trained foundation models are transforming AI model development. We theorize the release of open-source foundation models as a commoditization shock that lowers the barrier for developers to build applications on top of these foundation models. Using the release of Meta's Llama 2 on the Hugging Face platform as a quasi-experiment, we employ a difference-in-differences approach comparing text-generation model developers (treatment) with non-text-generation model developers (control). We find that following the release of Llama 2, text-generation model developers shift their efforts from commoditized text generation to less commoditized non-text-generation domains. Experienced developers exhibit strategic flexibility by reallocating their efforts and narrowing their scope of focus to maintain differentiation. Less experienced developers engage in broader exploration across various downstream applications, benefiting from lower entry barriers. These findings reveal how foundation models, as general-purpose technologies, commoditize AI development and reshape competitive dynamics in AI development ecosystems.
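For reference, the difference-in-differences design described in the abstract can be written as a two-way fixed-effects specification (a minimal sketch; the variable names and controls below are illustrative assumptions, not the authors' exact notation):

Y_{it} = \beta \, (\mathrm{TextGen}_i \times \mathrm{PostLlama2}_t) + \gamma' X_{it} + \alpha_i + \tau_t + \varepsilon_{it}

where Y_{it} is developer i's model-development outcome in period t, TextGen_i indicates a text-generation model developer (treatment group), PostLlama2_t indicates periods after the Llama 2 release, X_{it} is a vector of optional controls, \alpha_i and \tau_t are developer and time fixed effects, and \beta captures the estimated treatment effect.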
Recommended Citation
Wu, David; Lin, Jinan; and Li, Zhuoxin (Allen), "Foundation Models and AI Innovation: Evidence from the Hugging Face Platform" (2025). ICIS 2025 Proceedings. 15.
https://aisel.aisnet.org/icis2025/digitstrategy/digitstrategy/15
Foundation Models and AI Innovation: Evidence from the Hugging Face Platform
Comments
18-Strategy