The Laplace Mechanism has optimal utility for differential privacy over continuous queries
Saved in:
| Published in: | Proceedings of the 36th Annual ACM/IEEE Symposium on Logic in Computer Science, pp. 1-12 |
|---|---|
| Main authors: | , , |
| Format: | Conference paper |
| Language: | English |
| Published: | IEEE, 29.06.2021 |
| Summary: | Differential Privacy protects individuals' data when statistical queries are published from aggregated databases: applying "obfuscating" mechanisms to the query results makes the released information less specific but, unavoidably, also decreases its utility. Yet it has been shown that for discrete data (e.g. counting queries), a mandated degree of privacy, and a reasonable interpretation of loss of utility, the Geometric obfuscating mechanism is optimal: it loses as little utility as possible [Ghosh et al. [1]]. For continuous query results, however (e.g. real numbers), the optimality result does not hold. Our contribution here is to show that optimality is regained by using the Laplace mechanism for the obfuscation. The technical apparatus involved includes the earlier discrete result [Ghosh op. cit.], recent work on abstract channels and their geometric representation as hyper-distributions [Alvim et al. [2]], and the dual interpretations of distance between distributions provided by the Kantorovich-Rubinstein Theorem. |
|---|---|
| DOI: | 10.1109/LICS52264.2021.9470718 |
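
For context on the two obfuscating mechanisms the abstract compares, here is a minimal Python sketch: the Laplace mechanism for continuous (real-valued) query results and the Geometric mechanism for discrete counting queries. The function names, parameters, and example values are illustrative assumptions for this record, not code from the paper.

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Release a real-valued query answer with epsilon-differential privacy
    by adding Laplace noise of scale sensitivity/epsilon.

    `sensitivity` is the query's global sensitivity: the maximum change in
    the true answer when one individual's record is added or removed.
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    return true_answer + rng.laplace(loc=0.0, scale=scale)

def geometric_mechanism(true_count, epsilon, rng=None):
    """Release an integer counting-query answer by adding two-sided
    geometric noise (the discrete analogue of Laplace noise), the
    mechanism shown optimal for counting queries by Ghosh et al.
    """
    rng = np.random.default_rng() if rng is None else rng
    alpha = np.exp(-epsilon)
    # A two-sided geometric variable (P(noise = k) proportional to alpha^|k|)
    # is the difference of two independent geometric variables with
    # success probability 1 - alpha.
    noise = int(rng.geometric(1 - alpha)) - int(rng.geometric(1 - alpha))
    return true_count + noise

# Example usage with hypothetical values: a continuous query (average salary)
# and a discrete counting query (number of matching records).
if __name__ == "__main__":
    print(laplace_mechanism(true_answer=52_340.0, sensitivity=1_000.0, epsilon=0.5))
    print(geometric_mechanism(true_count=128, epsilon=0.5))
```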