Differential privacy

July 28, 2016 — January 24, 2020

confidentiality
statistics

Another thing I won’t have time to blog or fully understand, but will collect a few explanatory blog posts about for emergency cribbing.

Google’s blog post Learning Statistics with Privacy, aided by the Flip of a Coin explains the classic randomized-response trick:

Let’s say you wanted to count how many of your online friends were dogs, while respecting the maxim that, on the Internet, nobody should know you’re a dog. To do this, you could ask each friend to answer the question “Are you a dog?” in the following way. Each friend should flip a coin in secret, and answer the question truthfully if the coin came up heads; but, if the coin came up tails, that friend should always say “Yes” regardless. Then you could get a good estimate of the true count from the greater-than-half fraction of your friends that answered “Yes”. However, you still wouldn’t know which of your friends was a dog: each answer “Yes” would most likely be due to that friend’s coin flip coming up tails.

NB even a fair coin works here: if y is the observed fraction of “Yes” answers, 2(y − 1/2) recovers the true rate. Weighting the coin just trades privacy against the variance of the estimate, as the simulation below shows.
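
A minimal simulation of that scheme (my own sketch; `randomized_response` and `estimate_rate` are made-up names for illustration, not anyone’s API):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def randomized_response(truths, p_heads=0.5):
    # Heads: answer truthfully. Tails: say "Yes" regardless,
    # exactly as in the scheme quoted above.
    heads = rng.random(truths.size) < p_heads
    return np.where(heads, truths, True)

def estimate_rate(answers, p_heads=0.5):
    # E[yes] = (1 - p_heads) + p_heads * p, so invert for p.
    return (answers.mean() - (1 - p_heads)) / p_heads

# 10,000 friends, 10% of whom are secretly dogs.
truths = rng.random(10_000) < 0.10
answers = randomized_response(truths)
print(estimate_rate(answers))  # ~0.10, without exposing any individual
```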

This has lately attracted wider public interest because the US Census Bureau adopted differential privacy for the 2020 census’s disclosure-avoidance system. That has spawned some good layperson’s introductions:

Alexandra Wood et al.’s Differential Privacy: A Primer for a Non-Technical Audience (Wood et al. 2019) is one; Mark Hansen has written an illustrated explanation.

There is a fun paper (Dimitrakakis et al. 2013) arguing that sampling from a Bayesian posterior itself comes with differential-privacy guarantees, under certain conditions on the likelihood (see also Zhang, Rubinstein, and Dimitrakakis 2016).
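
For orientation, the guarantee being instantiated is Dwork’s (2006) ε-differential privacy, which bounds how much any single record can shift a mechanism’s output distribution:

```latex
% ε-differential privacy (Dwork 2006): for every pair of datasets D, D'
% differing in a single record, and every measurable output set S,
\Pr[A(D) \in S] \;\le\; e^{\varepsilon} \, \Pr[A(D') \in S].
```

Roughly, the posterior-sampling observation is that when the log-likelihood’s dependence on any one data point is uniformly bounded, releasing a single posterior draw already satisfies a bound of this form.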

Practical: Google’s differential privacy library covers miscellaneous aggregate reporting. PPRL (Privacy-Preserving Record Linkage) is an R package for probabilistically linking data sets in an (optionally) privacy-preserving way. Nils Amiet has written a review of several such libraries.
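
The workhorse inside such libraries is calibrated noise addition. A hand-rolled sketch of the Laplace mechanism for a counting query (my own illustration, not the API of any library above):

```python
import numpy as np

rng = np.random.default_rng()

def laplace_count(data, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one record
    # changes it by at most 1, so Laplace(1/epsilon) noise gives ε-DP.
    true_count = sum(predicate(x) for x in data)
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

ages = [23, 35, 41, 29, 52, 67, 34]
print(laplace_count(ages, lambda a: a > 40, epsilon=0.5))
```

Smaller ε means stronger privacy but noisier answers; real libraries mostly add bookkeeping (sensitivity analysis, privacy-budget accounting) around this core.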

1 References

Bassily, Nissim, Smith, et al. 2015. “Algorithmic Stability for Adaptive Data Analysis.” arXiv:1511.02513 [Cs].
Bréhier, Gazeau, Goudenège, et al. 2015. “Unbiasedness of Some Generalized Adaptive Multilevel Splitting Algorithms.” arXiv:1505.02674 [Math, Stat].
Dimitrakakis, Nelson, Zhang, et al. 2013. “Bayesian Differential Privacy Through Posterior Sampling.” arXiv:1306.1066 [Cs, Stat].
Dwork. 2006. “Differential Privacy.” In Automata, Languages and Programming (ICALP 2006).
Dwork, Feldman, Hardt, et al. 2015a. “Preserving Statistical Validity in Adaptive Data Analysis.” In Proceedings of the Forty-Seventh Annual ACM Symposium on Theory of Computing (STOC ’15).
Dwork, Feldman, Hardt, et al. 2015b. “The Reusable Holdout: Preserving Validity in Adaptive Data Analysis.” Science.
Dwork, Feldman, Hardt, et al. 2017. “Guilt-Free Data Reuse.” Communications of the ACM.
Fanti, Pihur, and Erlingsson. 2015. “Building a RAPPOR with the Unknown: Privacy-Preserving Learning of Associations and Data Dictionaries.” arXiv:1503.01214 [Cs].
Farokhi. 2020. “Distributionally-Robust Machine Learning Using Locally Differentially-Private Data.” arXiv:2006.13488 [Cs, Math, Stat].
Jung, Ligett, Neel, et al. 2019. “A New Analysis of Differential Privacy’s Generalization Guarantees.” arXiv:1909.03577 [Cs, Stat].
Sadeghi, Wang, Ma, et al. 2020. “Learning While Respecting Privacy and Robustness to Distributional Uncertainties and Adversarial Data.” arXiv:2007.03724 [Cs, Eess, Math].
Tschantz, Sen, and Datta. 2019. “Differential Privacy as a Causal Property.” arXiv:1710.05899 [Cs].
Wood, Altman, Bembenek, et al. 2019. “Differential Privacy: A Primer for a Non-Technical Audience.”
Zhang, Rubinstein, and Dimitrakakis. 2016. “On the Differential Privacy of Bayesian Inference.” In Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI ’16).