Journal on Communications ›› 2022, Vol. 43 ›› Issue (10): 94-105. doi: 10.11959/j.issn.1000-436x.2022189


Research on federated learning approach based on local differential privacy

Haiyan KANG, Yuanrui JI   

  1. School of Information Management, Beijing Information Science and Technology University, Beijing 100192, China
  • Revised: 2022-09-23 Online: 2022-10-25 Published: 2022-10-01
  • Supported by:
    The National Social Science Foundation of China (21BTQ079); The National Natural Science Foundation of China (61370139); The Ministry of Education Humanities and Social Science Project (20YJAZH046); Beijing Advanced Innovation Center for Future Blockchain and Privacy Computing Fund

Abstract:

As a collaborative machine learning framework, federated learning can train useful models while keeping participants' private data local. Nevertheless, from an information-theoretic viewpoint, a curious server can still infer private information from the shared model parameters uploaded by participants. To address this inference attack problem in federated learning training, a local differential privacy federated learning (LDP-FL) approach was proposed. Firstly, to protect the federated model training process against inference attacks, a local differential privacy mechanism was designed for the transmission of parameters in federated learning. Secondly, a performance loss constraint mechanism for federated learning was designed to reduce the performance loss of the locally differentially private federated model by optimizing the constraint range of the loss function. Finally, the effectiveness of the proposed LDP-FL approach was verified by comparative experiments on the MNIST and Fashion-MNIST datasets.
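For concreteness, the sketch below shows one common way a client can perturb its model parameters under local differential privacy before uploading them to the server. It uses per-coordinate clipping followed by Laplace noise; the choice of mechanism, the clipping bound, and the privacy budget epsilon are illustrative assumptions for this sketch, not the specific design proposed in the paper.

    # Sketch: client-side LDP perturbation of a model update before upload.
    # The Laplace mechanism, clip bound, and epsilon below are assumptions,
    # not the paper's exact construction.
    import numpy as np

    def perturb_update(weights, epsilon, clip=1.0, rng=None):
        """Clip each coordinate to [-clip, clip], then add Laplace noise
        calibrated to the per-coordinate sensitivity 2 * clip."""
        rng = rng or np.random.default_rng()
        clipped = np.clip(weights, -clip, clip)
        scale = 2.0 * clip / epsilon  # Laplace scale b = sensitivity / epsilon
        noise = rng.laplace(loc=0.0, scale=scale, size=clipped.shape)
        return clipped + noise

    # Each client perturbs its local update; the server only ever sees
    # noisy updates, which it aggregates (e.g., by averaging).
    local_update = 0.01 * np.random.randn(784, 10)  # toy update for an MNIST-sized layer
    noisy_update = perturb_update(local_update, epsilon=1.0)

In this setting, a smaller epsilon gives stronger privacy but larger noise, which is exactly the utility loss that the paper's performance loss constraint mechanism aims to limit.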

Key words: differential privacy, federated learning, deep learning

