Enhancing Federated Learning Robustness in Non-IID Data Environments via MMD-Based Distribution Alignment

Xiao Ma, Hong Shen, Wenqi Lyu, Wei Ke

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Federated learning (FL), due to its distributed nature, is highly susceptible to malicious attacks. Although various Byzantine-robust FL methods exist, they often fail to maintain robustness in practical scenarios because client data are non-independent and identically distributed (Non-IID). Moreover, existing FL methods often suffer from weight divergence caused by heterogeneous data distributions across clients. To address these issues, we propose a novel federated learning framework that aligns local data distributions across clients to enhance robustness for Non-IID data in adversarial environments. The framework introduces a feature transformation layer that incorporates Maximum Mean Discrepancy (MMD) as a regularization term, mitigating weight divergence by aligning local and global data distributions without sharing raw data. Our approach dynamically updates the statistics of both the local and global data, including the mean and variance, ensuring that local models remain closely aligned with the global model throughout training. Experimental results on the MNIST and CIFAR-10 datasets demonstrate that the proposed framework significantly improves robustness both in the absence of attacks and against untargeted attacks such as sign-flipping and additive noise.
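To illustrate the general idea of MMD-based regularization described in the abstract, the sketch below shows how an RBF-kernel MMD penalty can be added to a client's local training loss. This is a minimal illustration, not the authors' implementation: the function names (rbf_mmd2, local_loss), the tensors local_feats and global_feats, the kernel bandwidth sigma, and the trade-off weight lam are all hypothetical.

```python
# Minimal sketch (assumed, not the paper's code) of an MMD regularizer that
# pulls a client's feature distribution toward a shared global distribution.
import torch

def rbf_mmd2(x, y, sigma=1.0):
    """Biased estimate of squared MMD between samples x and y with an RBF kernel."""
    xx = torch.cdist(x, x) ** 2          # pairwise squared distances within x
    yy = torch.cdist(y, y) ** 2          # pairwise squared distances within y
    xy = torch.cdist(x, y) ** 2          # cross distances between x and y
    k_xx = torch.exp(-xx / (2 * sigma ** 2)).mean()
    k_yy = torch.exp(-yy / (2 * sigma ** 2)).mean()
    k_xy = torch.exp(-xy / (2 * sigma ** 2)).mean()
    return k_xx + k_yy - 2 * k_xy

def local_loss(task_loss, local_feats, global_feats, lam=0.1):
    """Task loss plus an MMD penalty aligning local features with global ones.
    lam is a hypothetical trade-off weight chosen for illustration."""
    return task_loss + lam * rbf_mmd2(local_feats, global_feats)
```

In practice, the global reference distribution would be summarized by statistics (e.g., running mean and variance) shared by the server rather than raw data, consistent with the privacy constraint stated in the abstract.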

Original language: English
Title of host publication: Parallel and Distributed Computing, Applications and Technologies - 25th International Conference, PDCAT 2024, Proceedings
Editors: Yupeng Li, Jianliang Xu, Yong Zhang
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 280-291
Number of pages: 12
ISBN (Print): 9789819642069
DOIs
Publication status: Published - 2025
Event: 25th International Conference on Parallel and Distributed Computing, Applications and Technologies, PDCAT 2024 - Hong Kong, China
Duration: 13 Dec 2024 - 15 Dec 2024

Publication series

Name: Lecture Notes in Computer Science
Volume: 15502 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 25th International Conference on Parallel and Distributed Computing, Applications and Technologies, PDCAT 2024
Country/Territory: China
City: Hong Kong
Period: 13/12/24 - 15/12/24

Keywords

  • Federated learning
  • Maximum Mean Discrepancy
  • Non-IID
  • Robustness
