On Differentially Private Sampling from Gaussian and Product Distributions
Abstract
In this work we study the problem of generating a new sample, with differential privacy (DP), from an unknown black-box distribution. Privacy is with respect to the i.i.d. samples drawn from the black-box: the distribution of the generated sample should remain nearly invariant even if one of the input samples is changed. Utility is measured by how close the distribution of the generated sample is to the black-box distribution. We show that this DP sampling problem is quantitatively easier than the DP learning problem when the unknown distribution is a
high-dimensional Gaussian, under various levels of knowledge about the covariance; we also show that our bounds are near-tight. For product distributions on the binary hypercube, we also show a qualitative improvement over existing results, obtaining a pure-DP sampling algorithm.