A Robust Federated Learning Method for Data Heterogeneity with Enhanced Momentum-guided Aggregation


Abstract

Federated learning (FL) is an emerging distributed machine learning paradigm that enables multiple edge devices to collaboratively train a model for a specific task while preserving privacy. However, because the data dispersed across edge devices are non-independent and identically distributed (Non-IID), FL suffers from slow convergence and low accuracy. This paper addresses this problem and proposes a robust method based on enhanced data sharing. Specifically, feature distillation is adopted to obtain performance-sensitive features, which are used to generate proxy data for initializing public data before FL training. Meanwhile, a momentum-guided strategy is used for parameter aggregation. Extensive experiments are conducted to evaluate the method; the results demonstrate that it outperforms state-of-the-art methods by 5.31% in terms of performance.
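The abstract does not give the exact update rule, but server-side momentum-guided aggregation is commonly realized in the style of FedAvgM: the server keeps a momentum buffer of past averaged client updates and applies the smoothed direction to the global model. The sketch below illustrates that general pattern; all names and the choice of hyperparameters are assumptions, not the paper's actual method.

```python
import numpy as np

def momentum_aggregate(global_params, client_params, momentum_buf,
                       lr=1.0, beta=0.9):
    """One round of momentum-guided server aggregation (FedAvgM-style sketch).

    Hypothetical illustration: the paper's concrete rule may differ.
    global_params: current global model parameters (1-D array)
    client_params: list of locally trained parameter arrays
    momentum_buf:  running momentum of past aggregated updates
    """
    # Average the client updates (deltas from the current global model).
    avg_delta = np.mean([cp - global_params for cp in client_params], axis=0)
    # Momentum buffer accumulates a smoothed direction across rounds,
    # which damps the oscillation caused by Non-IID client updates.
    momentum_buf = beta * momentum_buf + avg_delta
    # Apply the momentum-smoothed update to the global model.
    new_global = global_params + lr * momentum_buf
    return new_global, momentum_buf

# Example round: two clients drift in the same direction.
g = np.zeros(2)
clients = [np.array([1.0, 1.0]), np.array([3.0, 3.0])]
buf = np.zeros(2)
g, buf = momentum_aggregate(g, clients, buf)
```

With `beta = 0` this reduces to plain FedAvg on the client deltas; a positive `beta` carries information from earlier rounds forward, which is the intuition behind momentum-guided aggregation under data heterogeneity.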
