DOI:10.1145/3428233 - Corpus ID: 222208609
@article{Zhang2020TestingDP,
  title   = {Testing differential privacy with dual interpreters},
  author  = {Hengchu Zhang and Edo Roth and Andreas Haeberlen and Benjamin C. Pierce and Aaron Roth},
  journal = {Proceedings of the ACM on Programming Languages},
  year    = {2020},
  volume  = {4},
  pages   = {1--26},
  url     = {https://api.semanticscholar.org/CorpusID:222208609}
}
- Hengchu Zhang, Edo Roth, Andreas Haeberlen, Benjamin C. Pierce, Aaron Roth
- Published in Proc. ACM Program. Lang., 8 October 2020
- Computer Science
- Proceedings of the ACM on Programming Languages
The proposed DPCheck framework requires no programmer annotations, handles all previously verified or tested algorithms, and is the first fully automated framework to distinguish correct and buggy implementations of PrivTree, a probabilistically terminating algorithm that has not previously been mechanically checked.
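The style of testing that DPCheck automates can be illustrated with a small, self-contained sketch. This is not DPCheck's algorithm (which uses dual interpreters rather than brute-force sampling), and all function names below are illustrative: run a candidate mechanism many times on two adjacent databases, bin the outputs, and check whether any output's frequency ratio strays far above e^ε, which would flag a likely privacy violation.

```python
import math
import random
from collections import Counter

def laplace_mechanism(true_count, epsilon, rng):
    """Add Laplace(scale = 1/epsilon) noise to a counting query (sensitivity 1).

    A Laplace variate is generated as the difference of two exponentials.
    """
    noise = rng.expovariate(epsilon) - rng.expovariate(epsilon)
    return true_count + noise

def estimate_privacy_loss(mech, answer1, answer2, epsilon, n=200_000, seed=0):
    """Empirically estimate max_o |log Pr[M(db1) = o] - log Pr[M(db2) = o]|.

    Runs the mechanism n times on each of two adjacent databases (whose true
    answers differ by 1), bins the outputs, and compares bin frequencies.
    """
    rng = random.Random(seed)
    bins1 = Counter(round(mech(answer1, epsilon, rng)) for _ in range(n))
    bins2 = Counter(round(mech(answer2, epsilon, rng)) for _ in range(n))
    worst = 0.0
    for o in bins1.keys() & bins2.keys():
        # Only well-populated bins give stable frequency-ratio estimates.
        if bins1[o] > 500 and bins2[o] > 500:
            worst = max(worst, abs(math.log(bins1[o] / bins2[o])))
    return worst

# A correct epsilon = 1.0 mechanism should exhibit a loss close to (and not
# much above) 1.0; a buggy variant, e.g. one adding too little noise, would
# show a clearly larger empirical loss.
loss = estimate_privacy_loss(laplace_mechanism, 10, 11, epsilon=1.0)
print(f"empirical privacy loss ~ {loss:.2f} (claimed epsilon = 1.0)")
```

Sampling-based testing of this kind can only gain statistical confidence; part of what distinguishes DPCheck is that it checks proof obligations mechanically instead of relying purely on output frequencies.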
Figures from this paper
- figure 1
- figure 3
Topics
DP-Finder, Differential Privacy, PrivTree, Disclosure Avoidance System, Distributed Antenna Systems
12 Citations
- Gian Pietro Farina, Stephen Chong, Marco Gaboardi
- 2021
Computer Science
ESOP
This work designs a relational symbolic execution technique which supports reasoning about probabilistic coupling, a formal notion that has been shown useful to structure proofs of differential privacy.
- Matías Toro, David Darais, É. Tanter
- 2023
Computer Science
ACM Trans. Program. Lang. Syst.
Jazz is presented, a language and type system that uses linear types and latent contextual effects to support both advanced variants of differential privacy and higher-order programming, and its expressive power is illustrated through a number of case studies drawn from the recent differential privacy literature.
- Mark Bun, Marco Gaboardi, L. Glinskih
- 2022
Computer Science, Mathematics
2022 IEEE 35th Computer Security Foundations Symposium (CSF)
This work studies the complexity of verifying differential privacy for while-like programs working over boolean values and making probabilistic choices, and shows that deciding whether a program is differentially private for specific values of the privacy parameters is PSPACE-complete.
- Lisa Oakley, Steven Holtzen, Alina Oprea
- 2024
Computer Science
ArXiv
This work develops a method for tight privacy and accuracy bound synthesis using weighted model counting on binary decision diagrams, a state-of-the-art technique from the artificial intelligence and automated reasoning communities for exactly computing probability distributions.
- Depeng Liu, Bow-Yaw Wang, Lijun Zhang
- 2022
Computer Science, Mathematics
VMCAI
An automatic verification technique for Pufferfish privacy is proposed that can be combined with a testing-based approach to efficiently certify counterexamples and obtain a better lower bound for the privacy budget ε.
- Valentin Hartmann, Vincent Bindschaedler, Alexander Bentkamp, Robert West
- 2022
Computer Science
Proc. Priv. Enhancing Technol.
This paper observes that certain DP mechanisms are amenable to a posteriori privacy analysis that exploits the fact that some outputs leak less information about the input database than others, and introduces output differential privacy (ODP) and a new composition experiment to exploit this phenomenon.
- Yu Hao, S. Latif, Hailong Zhang, Raef Bassily, A. Rountev
- 2021
Computer Science
Dagstuhl Artifacts Ser.
A novel approach to identify hot traces from the collected randomized sketches is developed, showing that the very large domain of possible traces can be efficiently explored for hot traces by using the frequency estimates of a visited trace and its prefixes and suffixes.
- Benjamin Bichsel, Samuel Steffen, Ilija Bogunovic, Martin T. Vechev
- 2021
Computer Science
2021 IEEE Symposium on Security and Privacy (SP)
DP-Sniper is a practical black-box method that automatically finds violations of differential privacy by training a classifier to predict if an observed output was likely generated from one of two possible inputs and transforming this classifier into an approximately optimal attack on differential privacy.
- Yun Lu, Yu Wei, M. Magdon-Ismail, Vassilis Zikas
- 2022
Computer Science
IACR Cryptol. ePrint Arch.
This work devise a methodology for domain experts with limited knowledge of security to estimate the (differential) privacy of an arbitrary mechanism and develops the first approximate estimator for the standard, i.e., non-relative, definition of Distributional Differential Privacy (DDP).
- Gonzalo Munilla Garrido, Joseph P. Near, Aitsam Muhammad, Warren He, Roman Matzutt, F. Matthes
- 2021
Computer Science
ArXiv
It is concluded that these libraries provide similar utility (except in some notable scenarios); however, there are significant differences in the features provided, and no single library excels in all areas.
49 References
- Yuxin Wang, Zeyu Ding, Guanhong Wang, Daniel Kifer, Danfeng Zhang
- 2019
Computer Science
PLDI
This paper proposes a new proof technique, called shadow execution, and embeds it into a language called ShadowDP, which uses shadow execution to generate proofs of differential privacy with very few programmer annotations and without relying on customized logics and verifiers.
- G. Barthe, Rohit Chadha, V. Jagannath, A. Sistla, Mahesh Viswanathan
- 2019
Computer Science
ArXiv
This work proposes the first decision procedure for checking the differential privacy of a non-trivial class of probabilistic computations and implements it for (dis)proving privacy bounds for many well-known examples, including randomized response, histogram, report noisy max and sparse vector.
- G. Barthe, Marco Gaboardi, Benjamin Grégoire, Justin Hsu, Pierre-Yves Strub
- 2016
Computer Science, Mathematics
CCS
This work presents a new formalism extending apRHL, a relational program logic that has been used for proving differential privacy of non-interactive algorithms, with aHL, a (non-relational) program logic for accuracy properties; it exemplifies the three classes of algorithms and explores new variants of the Sparse Vector technique.
- J. Reed, B. Pierce
- 2010
Computer Science, Mathematics
ICFP '10
This work proposes to streamline proving algorithms differentially private, one at a time, with a functional language whose type system automatically guarantees differential privacy, allowing the programmer to write complex privacy-safe query programs in a flexible and compositional way.
- Danfeng Zhang, Daniel Kifer
- 2017
Computer Science
POPL
LightDP, a novel relational type system that separates relational reasoning from privacy budget calculations, is shown to verify sophisticated algorithms with little manual effort; it is powerful enough to verify algorithms where the composition theorem falls short.
- Aws Albarghouthi, Justin Hsu
- 2018
Computer Science
Proc. ACM Program. Lang.
This work presents a push-button, automated technique for verifying ε-differential privacy of sophisticated randomized algorithms and provides the first automated privacy proofs for a number of challenging algorithms from the differential privacy literature, including Report Noisy Max, the Exponential Mechanism, and the Sparse Vector Mechanism.
- Marco Gaboardi, Andreas Haeberlen, Justin Hsu, Arjun Narayan, B. Pierce
- 2013
Computer Science, Mathematics
POPL
DFuzz, an extension of Fuzz with a combination of linear indexed types and lightweight dependent types, is presented; it allows a richer sensitivity analysis that can certify a larger class of queries as differentially private, including ones whose sensitivity depends on runtime information.
- Zeyu Ding, Yuxin Wang, Guanhong Wang, Danfeng Zhang, Daniel Kifer
- 2018
Computer Science
CCS
This work considers the problem of producing counterexamples for incorrect algorithms that demonstrate how they violate their claimed privacy; an evaluation on a variety of incorrect published algorithms validates the usefulness of the approach.
- G. Barthe, Marco Gaboardi, B. Grégoire, Justin Hsu, Pierre-Yves Strub
- 2016
Computer Science
2016 31st Annual ACM/IEEE Symposium on Logic in Computer Science (LICS)
This paper develops compositional methods for formally verifying differential privacy for algorithms whose analysis goes beyond the composition theorem, based on deep connections between differential privacy and probabilistic couplings, an established mathematical tool for reasoning about stochastic processes.
- Joseph P. Near, David Darais, D. Song
- 2019
Computer Science, Mathematics
Proc. ACM Program. Lang.
This work proposes Duet, an expressive higher-order language, linear type system, and tool for automatically verifying differential privacy of general-purpose higher-order programs, and implements several differentially private machine learning algorithms in Duet that have never before been automatically verified by a language-based tool.