Communication-Efficient Federated Learning via Predictive Coding


Authors: Zhang, Chenhan; Zhang, Shiyao; Yu, Shui; Yu, James J.Q.
Venue: IEEE Wireless Communications and Networking Conference (WCNC) (2022 Impact Factor / JCR quartile: 7.695/Q1)

Abstract: Existing Federated Learning (FL) systems incur an enormous communication overhead when employing GNN-based models for traffic forecasting tasks, since these models commonly contain an enormous number of parameters to be transmitted in the FL system. In this paper, we propose an FL framework, namely Clustering-based hierarchical and Two-step-optimized FL (CTFL), to overcome this practical problem. CTFL employs a divide-and-conquer strategy, clustering clients based on the closeness of their local model parameters. Furthermore, we incorporate the particle swarm optimization algorithm into CTFL, which employs a two-step strategy for optimizing local models. This technique enables the central server to upload only one representative local model update from each cluster, thus reducing the communication overhead associated with model update transmission in FL. Comprehensive case studies on two real-world datasets and two state-of-the-art GNN-based models demonstrate the proposed framework's outstanding training efficiency and prediction accuracy, and the hyperparameter sensitivity of CTFL is also investigated.
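
The cluster-representative idea above can be illustrated with a short sketch. This is a minimal illustration, not the authors' CTFL implementation: it assumes flattened local model updates, uses k-means from scikit-learn as a stand-in for the paper's clustering step, and the function name `representative_updates` is hypothetical.

```python
# Minimal sketch (not the authors' CTFL code): group clients by the closeness
# of their flattened local model updates and keep only one representative
# update per cluster for transmission.
import numpy as np
from sklearn.cluster import KMeans

def representative_updates(client_updates, n_clusters):
    """client_updates: (num_clients, num_params) array of local model updates."""
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(client_updates)
    reps = []
    for c in range(n_clusters):
        members = client_updates[labels == c]
        centroid = members.mean(axis=0)
        # pick the member closest to the cluster centroid as the representative
        rep = members[np.argmin(np.linalg.norm(members - centroid, axis=1))]
        reps.append(rep)
    return np.stack(reps), labels

# toy usage: 20 clients with 1000-parameter updates reduced to 4 representatives
updates = np.random.randn(20, 1000)
reps, labels = representative_updates(updates, n_clusters=4)
print(reps.shape)  # (4, 1000) -- only 4 updates need to be transmitted
```
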
Keywords: Federated Learning, Distributed Optimization, Predictive Coding

Abstract (translated): Federated learning enables remote workers to collaboratively train a shared machine learning model while allowing the training data to remain local. In the use case of wireless mobile devices, communication overhead is a critical bottleneck due to limited power and bandwidth. Prior work has used various data compression tools such as quantization and sparsification to reduce this overhead. This paper proposes a predictive-coding-based compression scheme for federated learning. The scheme shares prediction functions among all devices and allows each worker to transmit a compressed residual vector relative to the reference. In each communication round, the predictor and quantizer are selected according to a rate-distortion cost, and redundancy is further reduced through entropy coding. Extensive simulations show that, compared with other baseline methods, the communication overhead can be reduced by up to 99% while achieving better learning performance.
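
The predictive-coding scheme described above can be sketched as follows. This is a minimal, hypothetical illustration rather than the paper's implementation: the two candidate predictors (all-zeros and the previous round's reconstructed update), the uniform quantizer step sizes, the empirical-entropy rate proxy, and the names `encode_update` / `entropy_bits` are all assumptions made for the example.

```python
# Minimal sketch of predictive-coding compression for a model update: the
# worker and server share the same predictors, so only a quantized residual
# has to be transmitted, and the (predictor, quantizer) pair is chosen by a
# rate-distortion cost.
import numpy as np

def entropy_bits(symbols):
    """Empirical entropy (bits/symbol) as a proxy for the entropy-coded rate."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def encode_update(update, prev_update, step_sizes=(0.01, 0.05, 0.1), lam=1.0):
    # candidate predictors shared by every device: all-zeros and the previous
    # round's reconstructed update
    predictors = {"zero": np.zeros_like(update), "previous": prev_update}
    best = None
    for name, pred in predictors.items():
        residual = update - pred
        for step in step_sizes:
            symbols = np.round(residual / step).astype(np.int64)
            distortion = float(np.mean((residual - symbols * step) ** 2))
            rate = entropy_bits(symbols)      # bits per parameter
            cost = distortion + lam * rate    # rate-distortion cost
            if best is None or cost < best[0]:
                best = (cost, name, step, symbols)
    _, name, step, symbols = best
    return name, step, symbols                # only this needs to be sent

# toy usage: the server reconstructs from the shared predictor plus the residual
prev = np.zeros(1000)
update = prev + 0.1 * np.random.randn(1000)
name, step, symbols = encode_update(update, prev)
reconstructed = (prev if name == "previous" else np.zeros_like(prev)) + symbols * step
print(name, step, float(np.mean((reconstructed - update) ** 2)))
```
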
Keywords (translated): Federated Learning; Distributed Optimization; Predictive Coding

DOI: 10.1109/WCNC51071.2022.9771883
Original paper link: Communication-Efficient Federated Learning via Predictive Coding