Hierarchical Split Federated Learning: Convergence Analysis and System Optimization

  • Zheng Lin
  • Wei Wei
  • Zhe Chen
  • Chan Tong Lam
  • Xianhao Chen
  • Yue Gao
  • Jun Luo

Research output: Contribution to journal › Article › peer-review

18 Citations (Scopus)

Abstract

As AI models expand in size, it has become increasingly challenging to deploy federated learning (FL) on resource-constrained edge devices. To tackle this issue, split federated learning (SFL) has emerged as an FL framework with reduced workload on edge devices via model splitting; it has received extensive attention from the research community in recent years. Nevertheless, most prior works on SFL focus only on a two-tier architecture without harnessing multi-tier cloud-edge computing resources. In this paper, we intend to analyze and optimize the learning performance of SFL under multi-tier systems. Specifically, we propose the hierarchical SFL (HSFL) framework and derive its convergence bound. Based on the theoretical results, we formulate a joint optimization problem for model splitting (MS) and model aggregation (MA). To solve this rather hard problem, we then decompose it into MS and MA sub-problems that can be solved via an iterative descending algorithm. Simulation results demonstrate that the tailored algorithm can effectively optimize MS and MA in multi-tier systems and significantly outperform existing schemes.
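The abstract describes decomposing the joint model-splitting (MS) and model-aggregation (MA) problem into two sub-problems solved by an iterative descending algorithm. A minimal sketch of such an alternating-descent loop is shown below; the function names, candidate sets, and toy objective are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch of an iterative descending (alternating) scheme:
# fix the model-aggregation (MA) decision and optimize the model-splitting
# (MS) decision, then fix MS and optimize MA, repeating until the objective
# stops decreasing. The objective and candidate sets are toy placeholders.

def alternating_descent(objective, ms_candidates, ma_candidates,
                        max_rounds=20, tol=1e-6):
    """Coordinate descent over two discrete decision sets."""
    ms, ma = ms_candidates[0], ma_candidates[0]
    best = objective(ms, ma)
    for _ in range(max_rounds):
        # MS sub-problem: best split choice under the current aggregation setting
        ms = min(ms_candidates, key=lambda s: objective(s, ma))
        # MA sub-problem: best aggregation setting under the current split choice
        ma = min(ma_candidates, key=lambda a: objective(ms, a))
        new = objective(ms, ma)
        if best - new < tol:  # no further descent: stop
            break
        best = new
    return ms, ma, best

# Toy objective over a split layer s and an aggregation interval a
toy = lambda s, a: (s - 3) ** 2 + (a - 2) ** 2 + 0.1 * s * a
ms, ma, val = alternating_descent(toy, list(range(1, 6)), list(range(1, 5)))
```

Each sub-problem is solved to optimality given the other variable, so the objective value is non-increasing across rounds, which is what guarantees the loop terminates.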

Original language: English
Pages (from-to): 9352-9367
Number of pages: 16
Journal: IEEE Transactions on Mobile Computing
Volume: 24
Issue number: 10
Publication status: Published - 2025

Keywords

  • Distributed learning
  • edge computing
  • hierarchical split federated learning
  • model aggregation
  • model splitting
