Utility-Optimized Local Differential Privacy Mechanisms for Distribution Estimation

Authors: 

Takao Murakami and Yusuke Kawamoto, AIST

Abstract: 

LDP (Local Differential Privacy) has been widely studied to estimate statistics of personal data (e.g., the distribution underlying the data) while protecting users' privacy. Although LDP does not require a trusted third party, it regards all personal data as equally sensitive, which causes excessive obfuscation and hence a loss of utility. In this paper, we introduce the notion of ULDP (Utility-optimized LDP), which provides a privacy guarantee equivalent to LDP only for sensitive data. We first consider the setting where all users use the same obfuscation mechanism, and propose two mechanisms providing ULDP: utility-optimized randomized response and utility-optimized RAPPOR. We then consider the setting where the distinction between sensitive and non-sensitive data may differ from user to user. For this setting, we propose a personalized ULDP mechanism with semantic tags to estimate the distribution of personal data with high utility while keeping secret what is sensitive for each user. We show theoretically and experimentally that our mechanisms provide much higher utility than the existing LDP mechanisms when much of the data is non-sensitive. We also show that when most of the data are non-sensitive, our mechanisms provide almost the same utility as non-private mechanisms in the low privacy regime.
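As background for the mechanisms the abstract names, here is a minimal sketch of standard k-ary randomized response together with the usual unbiased frequency inversion for distribution estimation. This is the generic LDP building block that the paper's utility-optimized variant refines, not the paper's uRR mechanism itself; the function names and structure are illustrative assumptions.

```python
import math
import random

def k_ary_rr(value, domain, epsilon):
    """Standard k-ary randomized response (generic LDP baseline, not the
    paper's utility-optimized uRR): report the true value with probability
    e^eps / (e^eps + k - 1), otherwise a uniformly random other value."""
    k = len(domain)
    p_true = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p_true:
        return value
    # Report one of the k - 1 other values uniformly at random.
    return random.choice([v for v in domain if v != value])

def estimate_distribution(reports, domain, epsilon):
    """Unbiased inversion for distribution estimation: if f is the observed
    frequency of a value among the noisy reports, its estimated true
    frequency is (f - q) / (p - q), with p = e^eps / (e^eps + k - 1) and
    q = 1 / (e^eps + k - 1)."""
    k = len(domain)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = 1.0 / (math.exp(epsilon) + k - 1)
    n = len(reports)
    return {v: (reports.count(v) / n - q) / (p - q) for v in domain}
```

Because every input is obfuscated with the same probabilities regardless of sensitivity, the estimator's variance grows with the domain size k even when most values are non-sensitive; the paper's utility-optimized mechanisms avoid this by obfuscating only sensitive data at full strength.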

BibTeX
@inproceedings{236288,
author = {Takao Murakami and Yusuke Kawamoto},
title = {Utility-Optimized Local Differential Privacy Mechanisms for Distribution Estimation},
booktitle = {28th {USENIX} Security Symposium ({USENIX} Security 19)},
year = {2019},
address = {Santa Clara, CA},
url = {https://www.usenix.org/conference/usenixsecurity19/presentation/murakami},
publisher = {{USENIX} Association},
}