Journal on Communications, 2023, Vol. 44, Issue (5): 79-93. DOI: 10.11959/j.issn.1000-436x.2023072


Communication-efficient federated learning method via redundant data elimination

Kaiju LI1,2, Qiang XU3, Hao WANG1,4   

  1 School of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
    2 College of Computer Science, Chongqing University, Chongqing 400044, China
    3 Department of Electrical Engineering, City University of Hong Kong, Hong Kong 999077, China
    4 Key Laboratory of Tourism Multisource Data Perception and Decision, Ministry of Culture and Tourism, Chongqing University of Posts and Telecommunications, Chongqing 400065, China
  • Revised: 2023-02-04  Online: 2023-05-25  Published: 2023-05-01
  • Supported by:
    The National Natural Science Foundation of China (42001398); The Natural Science Foundation of Chongqing (cstc2020jcyj-msxmX0635); Chongqing Postdoctoral Research Program Special Funding (2021XM3009); China Postdoctoral Foundation (2021M693929)

Abstract:

To address the impact of the limited network bandwidth of edge devices on the communication efficiency of federated learning (FL), and to transmit local model updates efficiently for model aggregation, a communication-efficient federated learning method via redundant data elimination was proposed. The essential causes of redundant update parameters were analyzed in light of the non-IID data and distributed model training characteristics of FL, novel definitions of sensitivity and loss-function tolerance for the coreset were given, and a federated coreset construction algorithm was proposed. Furthermore, to fit the extracted coreset, a distributed adaptive sparse network model evolution mechanism was designed to dynamically adjust the model structure and size before each global training round, which reduces the number of bits communicated between edge devices and the server while preserving the accuracy of the trained model. Experimental results show that, compared with the state-of-the-art method, the proposed method reduces the number of transmitted bits by 17% with only 0.5% degradation in model accuracy.
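
As a rough illustration of the coreset idea summarized above, the following Python sketch selects a weighted coreset on one client by sampling points with probability proportional to their sensitivity and dropping near-redundant samples below a tolerance threshold. The function and variable names, and the use of per-sample loss share as a sensitivity proxy, are assumptions for illustration only and are not taken from the paper.

```python
# Hypothetical sketch: sensitivity-based coreset selection on one FL client.
# Names and the loss-based sensitivity proxy are illustrative assumptions.
import numpy as np

def select_coreset(per_sample_losses, coreset_size, tolerance=0.05, rng=None):
    """Sample a weighted coreset with probability proportional to each
    sample's sensitivity (approximated here by its share of the total loss),
    discarding samples whose contribution falls below the tolerance threshold."""
    rng = rng or np.random.default_rng(0)
    losses = np.asarray(per_sample_losses, dtype=float)
    sensitivity = losses / losses.sum()               # normalized sensitivity proxy
    keep = sensitivity >= tolerance / len(losses)     # filter near-redundant samples
    idx = np.flatnonzero(keep)
    probs = sensitivity[idx] / sensitivity[idx].sum()
    m = min(coreset_size, idx.size)
    chosen = rng.choice(idx, size=m, replace=False, p=probs)
    # importance weights compensate for the biased (non-uniform) sampling
    weights = 1.0 / (probs[np.searchsorted(idx, chosen)] * m)
    return chosen, weights

# Example: 1000 synthetic per-sample losses, keep a coreset of 100
rng = np.random.default_rng(1)
losses = np.abs(rng.normal(1.0, 0.3, 1000))
indices, weights = select_coreset(losses, coreset_size=100, rng=rng)
print(indices[:5], weights[:5])
```

The importance weights keep the coreset's weighted loss an approximately unbiased estimate of the full local loss, so local training on the reduced set can stand in for training on all local data.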

Key words: federated learning, communication efficiency, coreset, model evolution, accuracy

