✨ TL;DR
This paper introduces a GDP-based auditing framework that provides the first tight privacy audits for the state-of-the-art differentially private synthetic data generators MST and AIM. The audits reveal only a small gap between theoretical guarantees and empirical measurements, even in strong-privacy regimes.
Differentially Private (DP) synthetic data generators like MST and AIM are widely deployed in practice, but verifying that they actually achieve their claimed privacy guarantees is difficult. Existing auditing methods struggle to provide tight bounds, especially in strong-privacy regimes where privacy parameters are stringent. Without tight audits, it remains uncertain whether these systems deliver the protection they promise, a critical concern for real-world deployments where privacy violations can have serious consequences.
The authors develop an auditing framework based on Gaussian Differential Privacy (GDP) that measures privacy through the complete false-positive/false-negative tradeoff curve rather than single-point estimates. They apply this framework to audit MST and AIM synthetic data generators under worst-case settings, which represent the most challenging scenarios for privacy preservation. The GDP framework allows for more precise characterization of privacy loss by examining the full hypothesis testing tradeoff, providing tighter bounds than previous auditing approaches.
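To make the tradeoff-curve idea concrete, the following is a minimal sketch (not the paper's implementation) of the μ-GDP tradeoff function from Gaussian Differential Privacy: under μ-GDP, an attacker distinguishing two neighboring datasets at false-positive rate α cannot achieve a false-negative rate below f_μ(α) = Φ(Φ⁻¹(1 − α) − μ), where Φ is the standard normal CDF. The parameter values below are illustrative, not taken from the paper.

```python
from statistics import NormalDist

_STD_NORMAL = NormalDist()  # standard normal: mean 0, stddev 1

def gdp_tradeoff(alpha: float, mu: float) -> float:
    """Minimum false-negative rate achievable at false-positive rate
    `alpha` against a mechanism satisfying mu-GDP."""
    return _STD_NORMAL.cdf(_STD_NORMAL.inv_cdf(1.0 - alpha) - mu)

if __name__ == "__main__":
    mu = 1.0  # illustrative privacy parameter, not from the paper
    for alpha in (0.01, 0.05, 0.10, 0.50):
        print(f"alpha={alpha:.2f}  beta >= {gdp_tradeoff(alpha, mu):.3f}")
```

An audit in this style compares the attacker's empirically measured (false-positive, false-negative) pairs against this curve: any point falling significantly below f_μ would contradict the claimed μ-GDP guarantee, while points hugging the curve indicate the guarantee is tight.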