Journal on Communications ›› 2018, Vol. 39 ›› Issue (1): 70-77. doi: 10.11959/j.issn.1000-436x.2018013


Stochastic gradient descent algorithm preserving differential privacy in MapReduce framework

Yihan YU,Yu FU,Xiaoping WU   

  1. Department of Information Security, Naval University of Engineering, Wuhan 430033, China
  • Revised: 2017-12-19 Online: 2018-01-01 Published: 2018-02-07
  • Supported by:
    The National Natural Science Foundation of China (61100042); The National Social Science Foundation of China (15GJ003-201)

Abstract:

Aiming at the contradiction between the efficiency and privacy of the stochastic gradient descent algorithm in a distributed computing environment, a stochastic gradient descent algorithm preserving differential privacy based on MapReduce was proposed. Under the MapReduce computing framework, the data were allocated randomly to the Map nodes, and the Map tasks were started independently to execute the stochastic gradient descent algorithm. The Reduce tasks were appointed to update the model when the sub-models met the update requirements, and to add Laplace random noise to achieve differential privacy protection. Based on the combinatorial properties of differential privacy, the output of the algorithm is proved to satisfy ε-differential privacy. The experimental results show that the algorithm has an obvious efficiency advantage and good data availability.
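
The following is a minimal sketch of the Map/Reduce split described in the abstract, assuming a simple logistic-regression model and an in-process simulation of the Map and Reduce stages; the function names, learning rate, epsilon, and the sensitivity bound are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def map_task(X, y, lr=0.1, epochs=5):
    """Map side: run plain SGD on one random data partition and return a sub-model."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1.0 / (1.0 + np.exp(-xi @ w))
            w -= lr * (pred - yi) * xi          # gradient step of logistic loss
    return w

def reduce_task(sub_models, epsilon, sensitivity):
    """Reduce side: merge the sub-models, then add Laplace noise for epsilon-DP."""
    w = np.mean(sub_models, axis=0)
    noise = np.random.laplace(scale=sensitivity / epsilon, size=w.shape)
    return w + noise

# Toy usage: random data split across 4 simulated Map nodes.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
y = (X @ np.array([1.0, -2.0, 0.5, 0.0, 1.5]) > 0).astype(float)
parts = np.array_split(rng.permutation(400), 4)
sub_models = [map_task(X[idx], y[idx]) for idx in parts]
# sensitivity=0.01 is a placeholder bound, not derived from the paper's analysis.
w_private = reduce_task(sub_models, epsilon=1.0, sensitivity=0.01)
print(w_private)
```

In this sketch each Map task trains on its own partition without coordination, so the expensive SGD work parallelizes; only the Reduce step touches the combined model, which is where the Laplace noise is injected to provide the differential privacy guarantee.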

Key words: machine learning, stochastic gradient descent, MapReduce, differential privacy preserving, Laplace mechanism

