Deep Reinforcement Learning–Driven Energy Management for Electric Vehicles in 6G-Connected Smart Grids
Abstract
The large-scale electrification of transportation is fundamentally reshaping modern power systems. Although electric vehicles (EVs) significantly reduce greenhouse gas emissions and fossil fuel dependence, uncontrolled charging behavior introduces severe operational challenges, including peak load amplification, voltage instability, accelerated infrastructure degradation, and increased operational costs. Simultaneously, the growing penetration of intermittent renewable energy sources such as solar and wind introduces supply-side variability that further complicates real-time energy balancing. To address these coupled challenges, this paper proposes a comprehensive deep reinforcement learning (DRL)–driven energy management framework deployed within a 6G-enabled smart grid architecture. The proposed model formulates EV–grid coordination as a multi-objective optimization problem that jointly considers dynamic charging demand, renewable intermittency, time-varying electricity pricing, battery state-of-charge (SOC) constraints, degradation-aware charging penalties, and communication latency effects. A deep neural network–based policy agent is trained through continuous interaction with the environment to learn adaptive control strategies without relying on deterministic forecasting assumptions. Extensive simulations over a 24-hour operational horizon demonstrate that the proposed DRL framework significantly mitigates peak load intensity, enhances renewable energy utilization, reduces operational energy cost, and lowers battery stress compared with rule-based and model predictive control strategies. The training convergence analysis confirms faster, more stable learning and superior cumulative reward performance relative to conventional reinforcement learning baselines. Furthermore, a latency sensitivity evaluation highlights the importance of ultra-low-latency 6G communication in sustaining real-time distributed coordination performance.
The results validate the scalability, robustness, and practical feasibility of DRL-enabled intelligent EV charging coordination, establishing a viable pathway toward sustainable, resilient, and economically optimized next-generation smart grid ecosystems.
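To make the multi-objective formulation sketched in the abstract concrete, the following is a minimal, self-contained toy environment for single-EV charging whose per-step reward combines grid energy cost under time-varying prices, a peak-load penalty, a degradation-aware charging penalty, and a terminal penalty for missing the SOC target. All numerical parameters (price and solar profiles, penalty weights, battery capacity) are hypothetical placeholders for illustration only; the paper's actual environment, data, and DRL agent are not reproduced here.

```python
import numpy as np

class EVChargingEnv:
    """Toy 24-hour EV charging environment (illustrative; all parameters hypothetical).

    State: (normalized hour, SOC, electricity price, solar surplus).
    Action: charging power in [0, p_max] kW.
    Reward: -(grid energy cost + peak-load penalty + degradation penalty),
    plus a terminal penalty for falling short of the target SOC.
    """

    def __init__(self, horizon=24, capacity_kwh=60.0, p_max_kw=11.0,
                 soc_init=0.2, soc_target=0.8, seed=0):
        self.horizon = horizon
        self.capacity = capacity_kwh
        self.p_max = p_max_kw
        self.soc_init = soc_init
        self.soc_target = soc_target
        rng = np.random.default_rng(seed)
        hours = np.arange(horizon)
        # Hypothetical time-of-use price ($/kWh): cheap overnight, evening peak.
        self.price = 0.10 + 0.15 * np.exp(-((hours - 18) ** 2) / 8.0) \
                     + 0.01 * rng.standard_normal(horizon)
        # Hypothetical solar surplus (kW), peaking at midday.
        self.solar = np.maximum(0.0, 5.0 * np.exp(-((hours - 12) ** 2) / 6.0))
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = self.soc_init
        return self._obs()

    def _obs(self):
        return np.array([self.t / self.horizon, self.soc,
                         self.price[self.t], self.solar[self.t]])

    def step(self, power_kw):
        power = float(np.clip(power_kw, 0.0, self.p_max))
        # Grid draw = charging power minus locally available solar surplus.
        grid_kw = max(0.0, power - self.solar[self.t])
        cost = self.price[self.t] * grid_kw
        peak_penalty = 0.02 * grid_kw ** 2               # discourages spiky grid draw
        degradation = 0.005 * (power / self.p_max) ** 2  # fast charging stresses cells
        self.soc = min(1.0, self.soc + power / self.capacity)
        reward = -(cost + peak_penalty + degradation)
        self.t += 1
        done = self.t >= self.horizon
        if done:
            # Degradation-aware objectives still require meeting the user's SOC target.
            reward -= 5.0 * max(0.0, self.soc_target - self.soc)
        return (None if done else self._obs()), reward, done
```

A DRL agent would learn a policy over this interface; as a quick sanity check, even a naive price-threshold heuristic (charge at full power only when the price is at or below its median) completes the horizon and meets the SOC target in this toy setting, which is the kind of rule-based baseline the abstract compares against.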