Energy-Aware Hybrid Decision Support System for Urban Traffic Signal Control: Multi-Agent Reinforcement Learning with Fuzzy Multi-Criteria IoT Routing

Abstract

Urban traffic congestion poses significant challenges to modern smart cities, consuming excessive energy through both vehicular emissions and intelligent transportation infrastructure. While multi-agent reinforcement learning (MARL) has shown promising results for adaptive traffic signal control, existing approaches overlook the substantial energy consumption of the IoT sensing and communication networks that enable these systems. This paper presents a novel hybrid decision support system (DSS) that jointly optimizes traffic flow performance and sensing infrastructure energy efficiency. Our approach integrates a MARL-based traffic signal controller with a fuzzy multi-criteria IoT routing layer that dynamically balances residual energy, hop count, link quality, and traffic load. Extensive experiments on CityFlowER benchmark scenarios (Hangzhou 4×4 and Jinan 3×4 road networks) demonstrate that our hybrid DSS achieves a 23.7% reduction in average travel time compared to fixed-time control while extending IoT network lifetime by 41.2% compared to conventional MARL with standard routing protocols. The system exhibits robust performance across varying traffic densities and maintains real-time operation suitable for deployment in large-scale urban environments.
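
To make the fuzzy multi-criteria routing idea concrete, the sketch below shows one plausible way a next-hop score could combine residual energy, hop count, link quality, and traffic load. The membership functions, weights, and all identifiers (`tri`, `Neighbor`, `route_score`) are illustrative assumptions, not details taken from the paper.

```python
from dataclasses import dataclass

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

@dataclass
class Neighbor:
    residual_energy: float  # normalized to [0, 1]
    hop_count: int          # hops to the sink
    link_quality: float     # e.g. packet delivery ratio in [0, 1]
    traffic_load: float     # normalized queue occupancy in [0, 1]

def route_score(n: Neighbor, max_hops: int = 10) -> float:
    """Aggregate fuzzy desirability of forwarding through neighbor n."""
    mu_energy = tri(n.residual_energy, 0.2, 1.0, 1.2)          # prefer high residual energy
    mu_hops = tri(1 - n.hop_count / max_hops, 0.0, 1.0, 1.2)   # prefer fewer hops
    mu_link = tri(n.link_quality, 0.5, 1.0, 1.2)               # prefer reliable links
    mu_load = tri(1 - n.traffic_load, 0.0, 1.0, 1.2)           # prefer lightly loaded nodes
    # Weighted aggregation; the weights below are assumptions for illustration.
    w = (0.35, 0.20, 0.25, 0.20)
    return w[0] * mu_energy + w[1] * mu_hops + w[2] * mu_link + w[3] * mu_load

def select_next_hop(neighbors: list[Neighbor]) -> Neighbor:
    """Pick the neighbor with the highest aggregated fuzzy score."""
    return max(neighbors, key=route_score)
```

In such a scheme, the MARL signal controller and the routing layer remain separate components: the controller consumes sensor data delivered over routes chosen by scores like the one above, so energy-aware forwarding extends network lifetime without altering the control policy itself.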
