Differential privacy: getting more for less

  • Cynthia Dwork

    Department of Computer Science, Harvard John A. Paulson School of Engineering and Applied Sciences, Harvard University, 150 Western Avenue, Allston, MA 02134, USA

This book chapter is published open access.

Abstract

The key to the success of differential privacy, now the gold standard for privacy-preserving data analysis, is the ability to quantify and reason about cumulative privacy loss over many differentially private interactions. When upper bounds on privacy loss are loose, deployments of the algorithms are necessarily conservative, and under high levels of composition much potential utility is lost. We survey two general approaches to getting more utility: privacy amplification methods, which are algorithmic, and definitional methods, which admit a wider class of algorithms and lead to tighter analyses of existing algorithms.
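As a rough illustration of why loose bounds on cumulative privacy loss matter, one can compare the basic composition bound (losses simply add) with the advanced composition theorem for k-fold composition of ε-differentially-private mechanisms. The sketch below is illustrative only; the parameter values are not taken from the chapter.

```python
import math

def basic_composition(eps: float, k: int) -> float:
    # Basic composition: k runs of an eps-DP mechanism are (k * eps)-DP.
    return k * eps

def advanced_composition(eps: float, k: int, delta_prime: float) -> float:
    # Advanced composition: k runs of an eps-DP mechanism are
    # (eps', delta')-DP with
    #   eps' = sqrt(2 k ln(1/delta')) * eps + k * eps * (e^eps - 1),
    # which grows roughly like sqrt(k) * eps for small eps.
    return (math.sqrt(2 * k * math.log(1 / delta_prime)) * eps
            + k * eps * (math.exp(eps) - 1))

# Illustrative parameters (assumed, not from the chapter):
eps, k, delta_prime = 0.1, 100, 1e-6
basic = basic_composition(eps, k)                    # 10.0
advanced = advanced_composition(eps, k, delta_prime)  # roughly 6.3
```

With these numbers the advanced bound is noticeably smaller than the basic one, at the price of a small failure probability delta'; tighter accounting of this kind is exactly what allows more queries under a fixed privacy budget.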