Research Article

D-(DP)2SGD: Decentralized Parallel SGD with Differential Privacy in Dynamic Networks

Algorithm 1

D-(DP)2SGD: Dynamic Decentralized Parallel Stochastic Gradient Descent.
Initialization
 Initial point x_0^i = x_0, step length γ, noise variance σ, and number of iterations T
end
for t = 0, 1, …, T − 1, in parallel for all nodes i = 1, …, n do
 Sample a training datum ξ_t^i;
 Compute the stochastic gradient g_t^i = ∇F(x_t^i; ξ_t^i) using the current local variable x_t^i and the datum ξ_t^i;
 Randomly generate the Laplace noise ε_t^i ∼ Lap(σ) and add the noise to the variable x_t^i to get the perturbed variable x̃_t^i: x̃_t^i = x_t^i + ε_t^i;
 Send the perturbed variable x̃_t^i and its degree d_t^i to its neighbors;
 Receive x̃_t^j and d_t^j from its neighbors, j ∈ N_t^i;
 Determine the mixing weights w_t^{ij} according to Equation (11);
 Compute the neighborhood weighted average of the perturbed variables obtained from neighbors: x_{t+1/2}^i = Σ_{j ∈ N_t^i ∪ {i}} w_t^{ij} x̃_t^j;
 Update its local variable: x_{t+1}^i = x_{t+1/2}^i − γ g_t^i;
end
Output: the local variables x_T^i, i = 1, …, n.
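The loop above can be sketched as a small NumPy simulation. Since Equation (11) is not reproduced in this excerpt, the sketch substitutes the Metropolis-Hastings rule, a common degree-based choice for mixing weights in dynamic networks; the toy least-squares objective, the random ring-plus-extra-edges topology, and all parameter values (`gamma`, `scale`, `T`, batch size) are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (assumption): each node holds samples of a shared linear model
# y = A x*, and all nodes cooperate to estimate x*.
n_nodes, dim, T = 8, 5, 200
gamma, scale = 0.05, 0.01          # step length and Laplace noise scale (assumed values)
x_star = rng.normal(size=dim)
A = [rng.normal(size=(20, dim)) for _ in range(n_nodes)]
y = [Ai @ x_star for Ai in A]

x = [np.zeros(dim) for _ in range(n_nodes)]  # common initial point x_0

def random_graph(n):
    """Dynamic topology: a ring plus a few random extra edges each round (assumption)."""
    nbrs = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
    for _ in range(n // 2):
        i, j = rng.choice(n, size=2, replace=False)
        nbrs[i].add(j)
        nbrs[j].add(i)
    return nbrs

def metropolis_weight(i, j, deg):
    # Metropolis-Hastings rule, a degree-based stand-in for Equation (11).
    return 1.0 / (1 + max(deg[i], deg[j]))

for t in range(T):
    nbrs = random_graph(n_nodes)
    deg = {i: len(nbrs[i]) for i in nbrs}
    # Each node perturbs its variable with Laplace noise before sharing it.
    x_tilde = [xi + rng.laplace(scale=scale, size=dim) for xi in x]
    new_x = []
    for i in range(n_nodes):
        # Stochastic gradient of the local least-squares loss on a sampled mini-batch.
        batch = rng.choice(len(y[i]), size=4, replace=False)
        g = A[i][batch].T @ (A[i][batch] @ x[i] - y[i][batch]) / len(batch)
        # Neighborhood weighted average of perturbed variables;
        # the self-weight absorbs the remaining mass so weights sum to 1.
        w_self = 1.0
        avg = np.zeros(dim)
        for j in nbrs[i]:
            w = metropolis_weight(i, j, deg)
            avg += w * x_tilde[j]
            w_self -= w
        avg += w_self * x_tilde[i]
        new_x.append(avg - gamma * g)   # consensus step followed by gradient step
    x = new_x

err = np.mean([np.linalg.norm(xi - x_star) for xi in x])
print(f"mean distance to optimum after {T} rounds: {err:.3f}")
```

Note the update order: the gradient is evaluated at the current local variable x_t^i, while the consensus average is taken over the *perturbed* variables x̃_t^j, so each node only ever reveals a noisy copy of its state to its neighbors.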