Making valid statistical inferences from privatized data is a central challenge in modern data analysis. In Bayesian settings, data augmentation MCMC (DAMCMC) methods impute the unobserved confidential data given noisy privatized summaries, enabling principled uncertainty quantification. However, standard DAMCMC often mixes slowly because it relies on component-wise Metropolis-within-Gibbs updates. We propose the Single-Offer-Multiple-Attempts (SOMA) sampler, a novel algorithm that improves acceptance rates by generating a single proposal and simultaneously evaluating it as a candidate replacement for every component. By sharing one proposal across all components, SOMA wastes fewer proposals on rejection. We prove lower bounds on SOMA’s acceptance probability and establish convergence rates in the two-component case. Experiments on synthetic and real census data, with linear regression and other models, confirm SOMA’s efficiency gains.
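To illustrate the shared-proposal idea behind SOMA, the sketch below contrasts it with component-wise Metropolis-within-Gibbs. It is a minimal, hypothetical rendering, not the paper's exact algorithm: the toy target (independent standard-normal components), the `soma_sweep` function name, the first-accept placement rule, and all parameters are assumptions made for illustration, and whether such a scheme preserves the exact stationary distribution depends on details established in the paper.

```python
import numpy as np

def log_component_target(x):
    """Toy per-component log-density (standard normal), standing in for
    the conditional of one confidential record given the privatized summary."""
    return -0.5 * x ** 2

def soma_sweep(state, rng, prop_scale=1.0):
    """One illustrative SOMA-style sweep: draw a SINGLE proposal (the 'offer')
    and attempt to substitute it for each component in turn (the 'attempts'),
    instead of drawing a fresh proposal per component as in plain
    Metropolis-within-Gibbs. Returns the new state and the index where the
    offer was placed (None if rejected everywhere)."""
    proposal = rng.normal(scale=prop_scale)          # one offer for all components
    for i in range(len(state)):                      # multiple attempts
        log_ratio = log_component_target(proposal) - log_component_target(state[i])
        if np.log(rng.uniform()) < log_ratio:        # Metropolis accept test
            new_state = state.copy()
            new_state[i] = proposal                  # place the offer at component i
            return new_state, i
    return state, None                               # offer wasted: rejected by all

# Usage: run a short chain on a 5-component state.
rng = np.random.default_rng(0)
state = np.zeros(5)
accepts = 0
for _ in range(200):
    state, placed = soma_sweep(state, rng)
    accepts += placed is not None
```

Because the offer gets up to `len(state)` chances to be accepted, the probability that it is wasted decays with the number of components, which is the intuition behind the acceptance-probability lower bounds stated in the abstract.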