Bounded-leakage differential privacy

Katrina Ligett, Charlotte Peale, Omer Reingold

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

We introduce and study a relaxation of differential privacy [3] that accounts for mechanisms that leak some additional, bounded information about the database. We apply this notion to reason about two distinct settings where the notion of differential privacy is of limited use. First, we consider cases, such as the 2020 US Census [1], in which some information about the database is released exactly or with small noise. Second, we consider the accumulation of privacy harms for an individual across studies that may not even include that individual's data. The tools that we develop for bounded-leakage differential privacy allow us to reason about privacy loss in these settings and to show that individuals retain some meaningful protections.
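For context, the standard (ε, δ)-differential privacy guarantee of [3], which bounded-leakage differential privacy relaxes, can be stated as follows; this is the standard formulation given for reference, not the paper's new definition:

\[ \Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S] + \delta \]

for every pair of neighboring databases \(D, D'\) differing in the data of a single individual, and every set \(S\) of possible outputs of the randomized mechanism \(M\). The relaxation studied here additionally accounts for a bounded amount of information about the database being leaked alongside the mechanism's output.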

Original language: English
Title of host publication: 1st Symposium on Foundations of Responsible Computing, FORC 2020
Editors: Aaron Roth
Publisher: Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
ISBN (Electronic): 9783959771429
State: Published - 1 May 2020
Event: 1st Symposium on Foundations of Responsible Computing, FORC 2020 - Virtual, Cambridge, United States
Duration: 1 Jun 2020 → …

Publication series

Name: Leibniz International Proceedings in Informatics, LIPIcs
Volume: 156
ISSN (Print): 1868-8969

Conference

Conference: 1st Symposium on Foundations of Responsible Computing, FORC 2020
Country/Territory: United States
City: Virtual, Cambridge
Period: 1/06/20 → …

Bibliographical note

Publisher Copyright:
© Katrina Ligett, Charlotte Peale, and Omer Reingold; licensed under Creative Commons License CC-BY

Keywords

  • Applications
  • Auxiliary information
  • Differential privacy
  • Leakage
  • Privacy

