differential privacy

Simulation-based Bayesian inference from privacy-protected data

Many modern statistical analysis and machine learning applications require training models on sensitive user data. Differential privacy provides a formal guarantee that individual-level information about users does not leak. In this framework, …
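To make the guarantee concrete (this is a generic illustration, not the method of the paper above), a standard way to release a statistic under epsilon-differential privacy is the Laplace mechanism: clip each record to a known range, compute the query, and add Laplace noise scaled to the query's sensitivity divided by epsilon. A minimal sketch in Python, assuming a bounded mean query with a publicly known sample size:

import numpy as np

def laplace_mean(data, lower, upper, epsilon, rng=None):
    # Release the mean of `data` under epsilon-DP via the Laplace mechanism.
    # Values are clipped to [lower, upper]; the sensitivity of the mean of n
    # points is then (upper - lower) / n, so Laplace noise with scale
    # sensitivity / epsilon yields an epsilon-DP release (bounded DP, n public).
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(data, dtype=float), lower, upper)
    n = len(x)
    sensitivity = (upper - lower) / n
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return x.mean() + noise

# Example: privatized mean of synthetic incomes with epsilon = 1.0
incomes = np.random.default_rng(0).uniform(20_000, 120_000, size=500)
print(laplace_mean(incomes, lower=0.0, upper=150_000.0, epsilon=1.0))

The released value is a noisy observation of the true mean, which is exactly what makes likelihood-based or simulation-based inference from privatized data nontrivial.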

Statistical Inference for Privatized Data with Unknown Sample Size

We develop both theory and algorithms to analyze privatized data under unbounded differential privacy (DP), where even the sample size is considered a sensitive quantity that requires privacy protection. We show that the distance between the sampling …
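Under the unbounded (add/remove-one-record) neighboring relation, the sample size itself is a sensitivity-1 count and must be released with noise rather than treated as public. As a hedged illustration of that step only (not the paper's algorithm), a two-sided geometric mechanism can privatize the count:

import numpy as np

def privatize_count(n, epsilon, rng=None):
    # Release a noisy sample size under epsilon-DP with the add/remove
    # (unbounded) neighboring relation, where the count has sensitivity 1.
    # The difference of two iid geometric variables with success probability
    # 1 - exp(-epsilon) is a discrete Laplace (two-sided geometric) noise term.
    rng = np.random.default_rng() if rng is None else rng
    p = 1.0 - np.exp(-epsilon)
    noise = int(rng.geometric(p)) - int(rng.geometric(p))
    return n + noise

# Example: the true sample size 812 is itself released only with noise
print(privatize_count(812, epsilon=0.5))

Because the release can differ from the true count (and can even be negative), downstream inference has to model the noisy sample size as data rather than condition on it as known.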