Journal on Communications ›› 2023, Vol. 44 ›› Issue (11): 79-93.doi: 10.11959/j.issn.1000-436x.2023196

• Topics: Distributed Edge Intelligence for Complex Environments •

Client grouping and time-sharing scheduling for asynchronous federated learning in heterogeneous edge computing environment

Qianpiao MA1, Qingmin JIA1, Jianchun LIU2,3, Hongli XU2,3, Renchao XIE1,4, Tao HUANG1,4   

  1 Future Network Research Center, Purple Mountain Laboratories, Nanjing 211111, China
    2 School of Computer Science and Technology, University of Science and Technology of China, Hefei 230026, China
    3 Suzhou Institute for Advanced Research, University of Science and Technology of China, Suzhou 215123, China
    4 State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing 100876, China
  • Revised: 2023-10-08 Online: 2023-11-01 Published: 2023-11-01
  • Supported by:
    The National Natural Science Foundation of China (U1709217, 61936015, 92267301)

Abstract:

To overcome three key challenges of federated learning in heterogeneous edge computing, namely edge heterogeneity, Non-IID data, and communication resource constraints, a grouping asynchronous federated learning (FedGA) mechanism was proposed. Edge nodes were divided into multiple groups, each of which performed global updates asynchronously with the global model, while edge nodes within a group communicated with the parameter server through time-sharing communication. Theoretical analysis established a quantitative relationship between the convergence bound of FedGA and the data distribution among the groups. A time-sharing scheduling method, the magic mirror method (MMM), was proposed to optimize the completion time of a single round of model updating within a group. Based on the theoretical analyses of both FedGA and MMM, an effective grouping algorithm was designed to minimize the overall training completion time. Experimental results demonstrate that the proposed FedGA and MMM reduce model training time by 30.1% to 87.4% compared with existing state-of-the-art methods.

Key words: edge computing, federated learning, Non-IID, heterogeneity, convergence analysis

