Research Article
[Retracted] Differentially Private Singular Value Decomposition for Training Support Vector Machines
Input: Raw data matrix D ∈ R^(n×d), number of instances n, number of features d, privacy parameters ε, δ and noise scale β, accumulative contribution rate of principal components γ;
Output: Classification model f(x), private singular vectors Vk;
Begin
(1) Generate a noise matrix E ∈ R^(n×d) whose entries are i.i.d. samples from N(0, β²);
(2) Add the noise matrix to the raw data matrix: D′ = D + E;
(3) Compute the singular values σ and the singular matrices U, V of D′ by SVD, D′ = UΣVᵀ;
(4) Select the smallest target dimension k whose accumulative contribution rate of the singular values reaches γ;
(5) Use the first k singular vectors Vk to project the original training instances onto the low-dimensional singular subspace: Y = DVk;
(6) Compute the classification model f(x) in the singular subspace;
(7) Use f(x) and Vk to predict new instances.
End
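The following is a minimal sketch of this pipeline in Python with NumPy and scikit-learn, assuming β has already been calibrated from (ε, δ) upstream and using a linear SVM (LinearSVC) as the classifier in the singular subspace; the function and parameter names are illustrative, not from the paper.

```python
import numpy as np
from sklearn.svm import LinearSVC


def dp_svd_svm(D, y, beta, gamma, random_state=None):
    """Sketch of the DP-SVD-SVM training procedure described above.

    D     : (n, d) raw data matrix
    y     : (n,) class labels
    beta  : standard deviation of the Gaussian perturbation noise
            (assumed pre-calibrated from epsilon and delta)
    gamma : accumulative contribution rate used to pick the target dimension k
    """
    rng = np.random.default_rng(random_state)

    # Steps (1)-(2): perturb the data matrix with i.i.d. Gaussian noise N(0, beta^2).
    E = rng.normal(loc=0.0, scale=beta, size=D.shape)
    D_noisy = D + E

    # Step (3): SVD of the perturbed matrix, D' = U diag(sigma) V^T.
    U, sigma, Vt = np.linalg.svd(D_noisy, full_matrices=False)

    # Step (4): smallest k whose cumulative singular-value share reaches gamma.
    contribution = np.cumsum(sigma) / np.sum(sigma)
    k = int(np.searchsorted(contribution, gamma) + 1)

    # Step (5): project the original instances onto the first k private singular vectors.
    Vk = Vt[:k].T                 # (d, k) private projection matrix
    Y = D @ Vk                    # (n, k) low-dimensional training set

    # Step (6): fit the classifier in the singular subspace.
    clf = LinearSVC().fit(Y, y)
    return clf, Vk


def predict(clf, Vk, X_new):
    # Step (7): new instances are projected with Vk before classification.
    return clf.predict(X_new @ Vk)
```

Note that only the perturbed matrix D′ is decomposed; the original instances are then projected with the resulting Vk, matching step (5) of the listing.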