Speaker | Sesh Kumar |
---|---|
Affiliation | Imperial College |
Date | Friday, 18 January 2019 |
Time | 13:00–14:00 |
Location | Roberts 421 |
Link | Zoom |
Event series | Jump Trading/ELLIS CSML Seminar Series |
Abstract | Differential privacy is concerned with preserving prediction quality while bounding the privacy impact on individuals whose information is contained in the data. We consider differentially private risk minimization problems with regularizers that induce structured sparsity; these regularizers are convex but often non-differentiable. We analyze standard differentially private algorithms such as output perturbation and objective perturbation. Output perturbation is a differentially private algorithm known to perform well for minimizing risks that are strongly convex, and previous works have derived dimensionality-independent excess risk bounds in this setting. In this paper, we consider a particular class of convex but non-smooth regularizers that induce structured sparsity, together with loss functions for generalized linear models. We derive an excess risk bound for output perturbation that is independent of the dimensionality of the problem, and we show that the existing analysis for objective perturbation extends to these risk minimization problems. |
Biography | |
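As a rough illustration of the output perturbation mechanism discussed in the abstract, the sketch below privately fits an L2-regularized logistic regression by solving the non-private problem and adding Gaussian noise calibrated to the minimizer's sensitivity. This is a minimal sketch under stated assumptions, not the speaker's method: it uses a smooth, strongly convex L2 regularizer rather than the non-smooth structured-sparsity regularizers the talk studies, and all function names, constants, and the (eps, delta) parameters are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def objective(w, X, y, lam):
    # Average logistic loss plus a strongly convex L2 regularizer.
    z = y * (X @ w)
    return np.mean(np.logaddexp(0.0, -z)) + 0.5 * lam * (w @ w)

def output_perturbation(X, y, lam, eps, delta, rng):
    """(eps, delta)-DP logistic regression via output perturbation.

    Assumes each row of X has L2 norm <= 1 and y in {-1, +1}, so the
    logistic loss is 1-Lipschitz in w and the non-private minimizer
    has L2 sensitivity at most 2 / (n * lam) under replacement of one
    example (illustrative constants, not the talk's analysis).
    """
    n, d = X.shape
    w_hat = minimize(objective, np.zeros(d), args=(X, y, lam)).x
    sensitivity = 2.0 / (n * lam)
    # Gaussian mechanism: noise scale calibrated to the L2 sensitivity.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return w_hat + rng.normal(0.0, sigma, size=d)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
X /= np.maximum(1.0, np.linalg.norm(X, axis=1, keepdims=True))  # enforce ||x|| <= 1
y = np.sign(X @ rng.normal(size=10) + 0.1 * rng.normal(size=500))
w_private = output_perturbation(X, y, lam=0.1, eps=1.0, delta=1e-5, rng=rng)
```

Note how the noise scale shrinks as 1/(n·lam·eps): this is where the strong convexity of the regularizer enters the standard analysis, and the talk's contribution is extending this kind of dimensionality-independent guarantee to convex but non-smooth, sparsity-inducing regularizers.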