TY - JOUR
T1 - Mutual Knowledge Distillation-Based Communication Optimization Method for Cross-Organizational Federated Learning
AU - Liu, Su
AU - Shen, Hong
AU - Law, Eddie K.L.
AU - Lam, Chan Tong
N1 - Publisher Copyright:
© 2025 by the authors.
PY - 2025/5
Y1 - 2025/5
AB - With the increasing severity of data privacy and security issues, cross-organizational federated learning faces challenges in communication efficiency and cost. Knowledge distillation, as an effective model compression technique, can reduce model size without significantly compromising accuracy, thereby lowering communication overhead. However, existing knowledge distillation methods either employ static distillation loss weights, ignoring bandwidth variations in communication networks, or fail to effectively account for bandwidth heterogeneity among nodes, leading to communication bottlenecks. To enhance overall system efficiency, new methods are urgently needed that enable models to achieve strong performance in resource-constrained environments. This paper proposes a communication optimization method based on mutual knowledge distillation (Fed-MKD) to address the bottlenecks caused by high communication costs in cross-organizational federated learning. By leveraging a mutual distillation mechanism, Fed-MKD enables collaborative local training of teacher and student models while reducing the frequency and size of global model transmissions to optimize communication. Our experimental results demonstrate that, compared to classical knowledge distillation methods, Fed-MKD significantly improves communication efficiency, with compression ratios ranging from 4.89× to 28.45×. Furthermore, Fed-MKD achieves up to 4.34× acceleration in convergence time across multiple datasets. These findings highlight the practical value of Fed-MKD in environments with heterogeneous data distributions and limited communication resources.
KW - communication optimization
KW - cross-organizational federated learning
KW - data heterogeneity
KW - integrated distillation
KW - knowledge distillation
UR - https://www.scopus.com/pages/publications/105004829157
U2 - 10.3390/electronics14091784
DO - 10.3390/electronics14091784
M3 - Article
AN - SCOPUS:105004829157
SN - 2079-9292
VL - 14
JO - Electronics (Switzerland)
JF - Electronics (Switzerland)
IS - 9
M1 - 1784
ER -