Section: New Results
Preserving differential privacy under finite-precision semantics
Participant: Dale Miller.
(Joint work with Ivan Gazeau and Catuscia Palamidessi.) The approximation introduced by the finite-precision representation of continuous data can induce arbitrarily large information leaks, even when the computation is secure under exact semantics. Such leakage can thus undermine design efforts aimed at protecting sensitive information. Gazeau, Miller, and Palamidessi [16] have applied differential privacy, an approach to privacy that emerged from the area of statistical databases, to this problem. In this approach, privacy is protected by adding noise to the true (private) value. To date, this approach has been proved correct only in the ideal case in which computations are carried out using an idealized, infinite-precision semantics. They therefore analyze the implementation level, where the semantics is necessarily finite-precision: the representation of real numbers and the operations on them are rounded according to some level of precision. They show that, in general, the differential privacy property is violated, but that a weaker (and, arguably, still acceptable) variant of the property holds instead, with only a minor degradation of the privacy level. Their analysis covers two noise-generating distributions: the standard Laplacian mechanism commonly used in differential privacy, and a bivariate version of the Laplacian recently introduced in the setting of privacy-aware geolocation.
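To fix intuitions, the following Python sketch shows the general shape of a Laplacian mechanism together with a rounding of its output to a finite grid. It is a minimal illustration only, not the construction analyzed in [16]; the function names, the sampling method, and the grid size are illustrative assumptions.

import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value with additive Laplace noise.

    Under ideal real-number semantics, noise of scale
    sensitivity/epsilon yields epsilon-differential privacy.
    """
    scale = sensitivity / epsilon
    # A Laplace sample is the difference of two independent
    # exponential samples with the same mean (here: scale).
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_value + noise

def rounded_laplace_mechanism(true_value, sensitivity, epsilon, grid=2**-20):
    """The same mechanism with its output rounded to a finite grid,
    a crude stand-in for finite-precision semantics.  The grid size
    here is an arbitrary illustrative choice.

    In an actual floating-point implementation, both the sampling
    above and this rounding are approximate; that gap between ideal
    and finite-precision semantics is the source of the privacy
    degradation studied in [16].
    """
    noisy = laplace_mechanism(true_value, sensitivity, epsilon)
    return round(noisy / grid) * grid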