Smart Pricing for Smart Charging: A Deep Reinforcement Learning Framework for Residential EV Infrastructure

Abstract

The increasing adoption of electric vehicles in residential buildings creates challenges for charging infrastructure management, particularly in pricing charging services to balance revenue, user satisfaction, and grid stability. Traditional pricing methods, such as fixed rates and time-of-use tariffs, cannot adapt to the dynamic nature of charging demand. We propose a reinforcement learning framework for dynamic pricing of residential EV charging stations. The framework formulates the pricing problem as a Markov decision process and employs proximal policy optimization to learn a pricing policy from real-time conditions. The state representation comprises ten features covering temporal indicators, charging loads, grid status, traffic, and weather. A multi-objective reward function balances revenue, station utilization, grid stability, and user satisfaction. The system is trained on 6878 charging sessions from a residential complex in Trondheim, Norway. The proposed method achieves an overall score of 0.569, improving on fixed pricing by 32.9% and on time-of-use pricing by 48.9%. Sensitivity analysis confirms that the model remains robust across different demand response assumptions. The main contributions are a custom reinforcement learning environment for residential EV charging and empirical evidence that learned policies outperform traditional pricing approaches.
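The MDP described in the abstract can be illustrated with a minimal environment sketch. The feature layout, price levels, demand model, and reward weights below are illustrative assumptions, not the authors' actual implementation; the abstract specifies only that the state has ten features (temporal, load, grid, traffic, weather) and that the reward combines revenue, utilization, grid stability, and user satisfaction.

```python
import random

class ChargingPricingEnv:
    """Hypothetical sketch of the dynamic-pricing MDP from the abstract.

    State: 10 features (temporal indicators, charging load, grid status,
    traffic, weather). Action: index into a discrete set of price levels.
    Reward: weighted combination of revenue, utilization, grid-stability
    penalty, and user satisfaction. All specifics are assumptions.
    """

    PRICE_LEVELS = [0.10, 0.20, 0.30, 0.40]  # price per kWh, illustrative

    def __init__(self, weights=(0.4, 0.2, 0.2, 0.2), seed=0):
        self.w_rev, self.w_util, self.w_grid, self.w_sat = weights
        self.rng = random.Random(seed)
        self.hour = 0

    def reset(self):
        self.hour = 0
        return self._state()

    def _state(self):
        # Ten-dimensional observation; feature names are assumed.
        return [
            self.hour / 23.0,              # hour of day (normalized)
            float(17 <= self.hour < 21),   # evening-peak indicator
            float(self.hour < 6),          # overnight indicator
            self.rng.random(),             # station charging load
            self.rng.random(),             # building base load
            self.rng.random(),             # grid stability index
            self.rng.random(),             # spot electricity price
            self.rng.random(),             # local traffic intensity
            self.rng.random(),             # outdoor temperature (scaled)
            self.rng.random(),             # precipitation indicator
        ]

    def step(self, action_idx):
        price = self.PRICE_LEVELS[action_idx]
        # Simple linear demand response: higher price lowers demand.
        demand = max(0.0, 1.0 - 1.5 * price + 0.1 * self.rng.random())
        revenue = price * demand
        utilization = demand
        grid_penalty = demand if 17 <= self.hour < 21 else 0.0
        satisfaction = 1.0 - price / self.PRICE_LEVELS[-1]
        reward = (self.w_rev * revenue + self.w_util * utilization
                  - self.w_grid * grid_penalty + self.w_sat * satisfaction)
        self.hour = (self.hour + 1) % 24
        done = self.hour == 0  # one episode = one day
        return self._state(), reward, done
```

A policy-gradient learner such as PPO would interact with this environment by observing the ten-feature state, choosing a price level, and maximizing the discounted multi-objective reward over each daily episode.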
