Paper Number
1465
Paper Type
Complete Research Paper
Abstract
As electric vehicles (EVs) become more prevalent, demand for charging infrastructure is growing. This paper explores the operational management of electric vehicle charging hubs (EVCHs) with a focus on service pricing. Specifically, we design a data-driven decision support system for the EVCH operator that dynamically publishes multiple prices for capacity-based charging services. In capacity-based services, users pay based on the charging rate they choose. The goal is to exploit heterogeneous user preferences, taking into account time-dependent and stochastic factors, by offering dynamic charging services. We address this problem by implementing reinforcement learning agents that learn optimal price-setting policies through interaction with the EVCH environment, without prior knowledge of user characteristics. Our findings indicate that the proposed pricing model outperforms existing benchmark policies. Furthermore, our analysis reveals that the system is responsive to exogenous factors, including electricity contracts and user behavior, and that our model successfully adapts to these changes.
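To illustrate the kind of interaction loop such a pricing agent would face, the sketch below shows a minimal, hypothetical Gym-style environment in which the operator publishes one price per charging-rate tier each hour and simulated users with heterogeneous preferences pick a tier or leave. All names and numbers (CapacityPricingEnv, RATES_KW, PRICE_GRID, the Poisson arrivals, the linear utility model) are assumptions for illustration only and are not taken from the paper.

# Illustrative sketch only: a toy capacity-based pricing environment.
# Every tier, price grid, and demand parameter below is a hypothetical
# assumption, not the authors' implementation.
import numpy as np


class CapacityPricingEnv:
    """Toy EVCH loop: each step, the agent publishes one price per
    charging-rate tier; simulated users pick a tier (or leave) based on a
    simple price-sensitivity model, and the reward is the step profit."""

    RATES_KW = np.array([3.7, 11.0, 22.0])      # hypothetical charging-rate tiers
    PRICE_GRID = np.linspace(0.10, 0.80, 8)     # candidate prices in EUR/kWh

    def __init__(self, electricity_cost=0.20, seed=0):
        self.rng = np.random.default_rng(seed)
        self.electricity_cost = electricity_cost  # assumed procurement cost, EUR/kWh
        self.hour = 0

    def reset(self):
        self.hour = 0
        return self._obs()

    def _obs(self):
        # Observation: time of day encoded cyclically.
        return np.array([np.sin(2 * np.pi * self.hour / 24),
                         np.cos(2 * np.pi * self.hour / 24)])

    def step(self, action):
        # action: one PRICE_GRID index per rate tier.
        prices = self.PRICE_GRID[np.asarray(action)]
        arrivals = self.rng.poisson(5)            # stochastic user arrivals
        profit = 0.0
        for _ in range(arrivals):
            # Heterogeneous users: higher willingness to pay for faster charging.
            valuations = 0.25 + 0.02 * self.RATES_KW + self.rng.normal(0, 0.05, 3)
            utility = valuations - prices
            choice = int(np.argmax(utility))
            if utility[choice] > 0:               # user accepts a tier or leaves
                energy = min(self.RATES_KW[choice], 20.0)  # kWh delivered this step
                profit += (prices[choice] - self.electricity_cost) * energy
        self.hour = (self.hour + 1) % 24
        done = self.hour == 0                     # one episode = 24 hours
        return self._obs(), profit, done, {}

Any standard reinforcement learning algorithm (e.g., Q-learning or a policy-gradient method) could be trained against such a loop by treating the published price indices as the action and the per-step profit as the reward; the paper's actual environment, user model, and agents differ from this sketch.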
Recommended Citation
Ahadi, Ramin; Schroer, Karsten; and Ketter, Wolfgang, "Managing Electric Vehicle Charging Hubs Through Dynamic Capacity-Based Pricing" (2024). ECIS 2024 Proceedings. 38.
https://aisel.aisnet.org/ecis2024/track17_greenis/track17_greenis/38
Managing Electric Vehicle Charging Hubs Through Dynamic Capacity-Based Pricing