Accelerated Federated Learning Using Self-Adapting Bat Algorithm

Abstract

Federated learning (FL) is a distributed machine learning (ML) framework designed to address data silos and data privacy. A significant challenge in FL is the non-independent and identically distributed (Non-IID) nature of client data, which leads to slow convergence and low prediction accuracy. To tackle these issues, we propose an FL scheme based on the bat algorithm (FedBat), which leverages the echolocation mechanism of bats to balance global and local search capabilities and optimizes model weight updates through dynamic adjustment of the search strategy. FedBat also adapts its parameters across different datasets. To mitigate client drift, we extend FedBat with the Jensen-Shannon (JS) divergence to quantify the difference between local and global models: each client decides whether to upload its local model based on this difference, which enhances the global model's generalization capability and reduces communication overhead. Experimental results demonstrate that FedBat converges 5 times faster and improves test accuracy by more than 40% compared to FedAvg. The extended FedBat effectively mitigates the loss in the global model's generalization performance and reduces communication costs by around 20%. Compared with FedPSO, FedGwo, and FedProx, FedBat achieves superior convergence rate and test accuracy. We also derive the expected convergence rate of FedBat, analyze the impact of its parameters on FL performance, and establish an upper bound on its model divergence.
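To make the bat-inspired update concrete, the sketch below applies the standard bat-algorithm position update to a client's flattened weight vector, treating the global model as the best-known position. This is a minimal NumPy illustration under assumed conventions; the function name, default frequency bounds, and the loudness/pulse-rate schedules (the "self-adapting" part) are illustrative assumptions, not the paper's exact FedBat rule.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def bat_update(local_w, global_w, velocity, loudness, pulse_rate, round_t,
               f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9, r0=0.5):
    """One bat-style update of a client's flattened weight vector (sketch).

    local_w / global_w   : client weights (bat position) / global weights (best position)
    velocity             : per-client velocity vector, persisted across rounds
    loudness, pulse_rate : self-adapting bat parameters
    round_t              : current communication round
    """
    # A random frequency scales the pull toward the global best (exploration).
    freq = f_min + (f_max - f_min) * rng.uniform()
    velocity = velocity + (local_w - global_w) * freq
    candidate = local_w + velocity

    # Occasionally take a small random walk around the global best (exploitation).
    if rng.uniform() > pulse_rate:
        candidate = global_w + 0.01 * loudness * rng.standard_normal(global_w.shape)

    # Self-adaptation: loudness decays while the pulse emission rate grows,
    # gradually shifting the balance from global toward local search.
    loudness = alpha * loudness
    pulse_rate = r0 * (1.0 - np.exp(-gamma * round_t))
    return candidate, velocity, loudness, pulse_rate
```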
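The JS-divergence upload gate can likewise be sketched. Here the two weight vectors are histogrammed on a shared support to obtain discrete distributions; the histogram-based comparison, the threshold value, and the upload-if-below-threshold rule (filtering out drifted clients) are all assumptions for illustration, since the abstract does not specify the exact criterion.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    p = np.asarray(p, dtype=np.float64) + eps
    q = np.asarray(q, dtype=np.float64) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a * np.log(a / b)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def should_upload(local_w, global_w, threshold=0.1, bins=100):
    """Hypothetical gate: upload only if the local model has not drifted
    too far from the global model, saving communication otherwise."""
    lo = min(local_w.min(), global_w.min())
    hi = max(local_w.max(), global_w.max())
    p, _ = np.histogram(local_w, bins=bins, range=(lo, hi))
    q, _ = np.histogram(global_w, bins=bins, range=(lo, hi))
    return js_divergence(p, q) < threshold
```

Skipping uploads whose divergence exceeds the threshold both keeps drifted local models out of the aggregate and trims per-round communication, which is consistent with the roughly 20% cost reduction reported above.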
