TY - JOUR
T1 - Accuracy First: Selecting a Differential Privacy Level for Accuracy-Constrained ERM
AU - Ligett, Katrina
AU - Neel, Seth
AU - Roth, Aaron
AU - Waggoner, Bo
AU - Wu, Zhiwei Steven
N1 - Publisher Copyright:
© K. Ligett, S. Neel, A. Roth, B. Waggoner, and Z. Wu.
PY - 2019/10/23
Y1 - 2019/10/23
AB - Traditional approaches to differential privacy assume a fixed privacy requirement ε for a computation, and attempt to maximize the accuracy of the computation subject to the privacy constraint. As differential privacy is increasingly deployed in practical settings, it may often be that there is instead a fixed accuracy requirement for a given computation and the data analyst would like to maximize the privacy of the computation subject to the accuracy constraint. This raises the question of how to find and run a maximally private empirical risk minimizer subject to a given accuracy requirement. We propose a general “noise reduction” framework that can apply to a variety of private empirical risk minimization (ERM) algorithms, using them to “search” the space of privacy levels to find the empirically strongest one that meets the accuracy constraint, and incurring only logarithmic overhead in the number of privacy levels searched. The privacy analysis of our algorithm leads naturally to a version of differential privacy where the privacy parameters are dependent on the data, which we term ex-post privacy, and which is related to the recently introduced notion of privacy odometers. We also give an ex-post privacy analysis of the classical AboveThreshold privacy tool, modifying it to allow for queries chosen depending on the database. Finally, we apply our approach to two common objective functions, regularized linear and logistic regression, and empirically compare our noise reduction methods to (i) inverting the theoretical utility guarantees of standard private ERM algorithms and (ii) a stronger, empirical baseline based on binary search.
KW - accuracy first
KW - differential privacy
KW - empirical risk minimization
UR - http://www.scopus.com/inward/record.url?scp=85130151258&partnerID=8YFLogxK
DO - 10.29012/jpc.682
M3 - Article
AN - SCOPUS:85130151258
SN - 2575-8527
VL - 9
SP - 1
EP - 24
JO - Journal of Privacy and Confidentiality
JF - Journal of Privacy and Confidentiality
IS - 2
ER -