Abstract
We introduce and study a relaxation of differential privacy [3] that accounts for mechanisms that leak some additional, bounded information about the database. We apply this notion to reason about two distinct settings in which standard differential privacy is of limited use. First, we consider cases, such as the 2020 US Census [1], in which some information about the database is released exactly or with small noise. Second, we consider the accumulation of privacy harms for an individual across studies that may not even include that individual's data. The tools that we develop for bounded-leakage differential privacy allow us to reason about privacy loss in these settings, and to show that individuals retain some meaningful protections.
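As background for the abstract's mention of information "released exactly or with small noise," the canonical noise-adding building block of differential privacy is the Laplace mechanism, which perturbs a numeric query with noise scaled to the query's sensitivity divided by the privacy parameter ε. A minimal sketch (function names are illustrative, not taken from the paper):

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via the inverse-CDF transform."""
    # random.random() lies in [0, 1), so u lies in [-0.5, 0.5);
    # the boundary u == -0.5 (log of 0) occurs with negligible probability.
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_value, epsilon, sensitivity=1.0):
    """Release true_value plus Laplace noise of scale sensitivity/epsilon,
    the standard epsilon-differentially-private release of a numeric query."""
    return true_value + laplace_noise(sensitivity / epsilon)
```

Smaller ε means larger noise and stronger privacy; the paper's bounded-leakage relaxation then accounts for any additional information released alongside such noisy outputs.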
Original language | English |
---|---|
Title of host publication | 1st Symposium on Foundations of Responsible Computing, FORC 2020 |
Editors | Aaron Roth |
Publisher | Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing |
ISBN (Electronic) | 9783959771429 |
State | Published - 1 May 2020 |
Event | 1st Symposium on Foundations of Responsible Computing, FORC 2020 - Virtual, Cambridge, United States Duration: 1 Jun 2020 → … |
Publication series
Name | Leibniz International Proceedings in Informatics, LIPIcs |
---|---|
Volume | 156 |
ISSN (Print) | 1868-8969 |
Conference
Conference | 1st Symposium on Foundations of Responsible Computing, FORC 2020 |
---|---|
Country/Territory | United States |
City | Virtual, Cambridge |
Period | 1/06/20 → … |
Bibliographical note
Publisher Copyright: © Katrina Ligett, Charlotte Peale, and Omer Reingold; licensed under Creative Commons License CC-BY
Keywords
- Applications
- Auxiliary information
- Differential privacy
- Leakage
- Privacy