Testing differential privacy with dual interpreters | Semantic Scholar

@article{Zhang2020TestingDP,
  title   = {Testing differential privacy with dual interpreters},
  author  = {Hengchu Zhang and Edo Roth and Andreas Haeberlen and Benjamin C. Pierce and Aaron Roth},
  journal = {Proceedings of the ACM on Programming Languages},
  year    = {2020},
  volume  = {4},
  pages   = {1--26},
  url     = {https://api.semanticscholar.org/CorpusID:222208609}
}
  • Hengchu Zhang, Edo Roth, Andreas Haeberlen, Benjamin C. Pierce, Aaron Roth
  • Published in Proc. ACM Program. Lang., 8 October 2020
  • Computer Science
  • Proceedings of the ACM on Programming Languages

The proposed DPCheck framework requires no programmer annotations, handles all previously verified or tested algorithms, and is the first fully automated framework to distinguish correct and buggy implementations of PrivTree, a probabilistically terminating algorithm that has not previously been mechanically checked.
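
For orientation, the property DPCheck tests is the standard ε-differential-privacy inequality Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D') ∈ S] for neighboring databases D and D'. The sketch below (Python) checks that inequality by naive sampling for a Laplace-noised counting query; it is purely illustrative of the property being tested, not DPCheck's dual-interpreter technique, and all function names here are mine.

    # Illustrative only: a naive sampling check of the eps-DP inequality for one
    # event S. DPCheck itself instruments the program with two coupled ("dual")
    # interpretations rather than brute-force sampling.
    import math
    import random

    def laplace_count_release(counts, eps):
        # Laplace(1/eps) noise per count; the difference of two Exp(eps) draws
        # is Laplace-distributed with scale 1/eps.
        return [c + random.expovariate(eps) - random.expovariate(eps) for c in counts]

    def estimate_prob(db, event, eps, trials=50_000):
        return sum(event(laplace_count_release(db, eps)) for _ in range(trials)) / trials

    def looks_private(db, db_neighbor, event, eps, slack=0.01):
        p = estimate_prob(db, event, eps)
        q = estimate_prob(db_neighbor, event, eps)
        return p <= math.exp(eps) * q + slack   # slack absorbs sampling error

    db, db_neighbor = [10, 5, 3], [11, 5, 3]    # neighboring databases
    event = lambda out: out[0] > 12             # an arbitrary output event
    print(looks_private(db, db_neighbor, event, eps=1.0))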

12 Citations

  • Background Citations: 7
  • Methods Citations: 3

Figures from this paper

  • Figure 1
  • Figure 3

Topics

DP-Finder, Differential Privacy, PrivTree, Disclosure Avoidance System, Distributed Antenna Systems

12 Citations

Coupled Relational Symbolic Execution for Differential Privacy
    Gian Pietro Farina, Stephen Chong, Marco Gaboardi

    Computer Science

    ESOP

  • 2021

This work designs a relational symbolic execution technique which supports reasoning about probabilistic coupling, a formal notion that has been shown useful to structure proofs of differential privacy.

Contextual Linear Types for Differential Privacy
    Matías Toro, David Darais, É. Tanter

    Computer Science

    ACM Trans. Program. Lang. Syst.

  • 2023

Jazz is presented, a language and type system that uses linear types and latent contextual effects to support both advanced variants of differential privacy and higher-order programming, and its expressive power is illustrated through a number of case studies drawn from the recent differential privacy literature.

  • 3
  • PDF
The Complexity of Verifying Boolean Programs as Differentially Private
    Mark Bun, Marco Gaboardi, L. Glinskih

    Computer Science, Mathematics

    2022 IEEE 35th Computer Security Foundations…

  • 2022

This work studies the complexity of verifying differential privacy for while-like programs that work over boolean values and make probabilistic choices, and shows that deciding whether a program is differentially private for specific values of the privacy parameters is PSPACE-complete.

Synthesizing Tight Privacy and Accuracy Bounds via Weighted Model Counting
    Lisa Oakley, Steven Holtzen, Alina Oprea

    Computer Science

    ArXiv

  • 2024

This work develops a method for synthesizing tight privacy and accuracy bounds using weighted model counting on binary decision diagrams, a state-of-the-art technique from the artificial intelligence and automated reasoning communities for exactly computing probability distributions (a toy exact computation of this kind follows the entry).
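
As a toy stand-in for the exact analysis this paper automates, the sketch below enumerates the finite output distribution of binary randomized response and computes its tight ε by hand; the function names are mine and the paper's weighted-model-counting/BDD machinery is not represented.

    # Illustrative only: exact epsilon for binary randomized response, obtained
    # by enumerating the (finite) output distribution -- the kind of quantity
    # the paper synthesizes for whole programs via weighted model counting.
    import math

    def randomized_response(secret_bit, p_truth=0.75):
        # Output distribution: report the true bit with prob p_truth, else flip it.
        return {secret_bit: p_truth, 1 - secret_bit: 1 - p_truth}

    def tight_epsilon(dist_a, dist_b):
        # Smallest eps with dist_a[o] <= e^eps * dist_b[o] for every output o (and vice versa).
        return max(abs(math.log(dist_a[o] / dist_b[o])) for o in dist_a)

    print(tight_epsilon(randomized_response(0), randomized_response(1)))  # ln(3) ~ 1.0986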

Verifying Pufferfish Privacy in Hidden Markov Models
    Depeng Liu, Bow-Yaw Wang, Lijun Zhang

    Computer Science, Mathematics

    VMCAI

  • 2022

An automatic verification technique for Pufferfish privacy is proposed that can be combined with a testing-based approach to efficiently certify counterexamples and obtain a better lower bound for the privacy budget ε.

Privacy accounting εconomics: Improving differential privacy composition via a posteriori bounds
    Valentin Hartmann, Vincent Bindschaedler, Alexander Bentkamp, Robert West

    Computer Science

    Proc. Priv. Enhancing Technol.

  • 2022

This paper observes that certain DP mechanisms are amenable to a posteriori privacy analysis that exploits the fact that some outputs leak less information about the input database than others, and introduces output differential privacy (ODP) and a new composition experiment to exploit this phenomenon.

Differential Privacy for Coverage Analysis of Software Traces (Artifact)
    Yu Hao, S. Latif, Hailong Zhang, Raef Bassily, A. Rountev

    Computer Science

    Dagstuhl Artifacts Ser.

  • 2021

A novel approach to identify hot traces from the collected randomized sketches is developed, showing that the very large domain of possible traces can be efficiently explored for hot traces by using the frequency estimates of a visited trace and its prefixes and suffixes.

  • 2
  • PDF
DP-Sniper: Black-Box Discovery of Differential Privacy Violations using Classifiers
    Benjamin Bichsel, Samuel Steffen, Ilija Bogunovic, Martin T. Vechev

    Computer Science

    2021 IEEE Symposium on Security and Privacy (SP)

  • 2021

DP-Sniper is a practical black-box method that automatically finds violations of differential privacy: it trains a classifier to predict whether an observed output was likely generated from one of two possible inputs, then transforms this classifier into an approximately optimal attack on differential privacy (sketched after this entry).

  • 24
  • PDF
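
A minimal sketch of the classifier-to-attack idea described above, assuming a simple Laplace-noised sum and scikit-learn's LogisticRegression; DP-Sniper's actual threshold selection and probability estimation are more careful, and all names here are mine, not the tool's API.

    # Illustrative only: train a classifier to guess which of two neighboring
    # inputs produced an output, turn it into an attack set, and read off an
    # empirical lower bound on epsilon.
    import math
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def mechanism(true_sum, eps=1.0, n=1):
        return true_sum + np.random.laplace(scale=1.0 / eps, size=n)  # noisy sum

    a, b = 10.0, 11.0                              # neighboring inputs (sums differ by 1)
    X = np.concatenate([mechanism(a, n=20000), mechanism(b, n=20000)]).reshape(-1, 1)
    y = np.concatenate([np.zeros(20000), np.ones(20000)])
    clf = LogisticRegression().fit(X, y)

    # Attack set S = {x : P(input was a | x) > t}; estimate Pr[M(a) in S] and Pr[M(b) in S].
    t = 0.6
    in_S = lambda xs: clf.predict_proba(xs.reshape(-1, 1))[:, 0] > t
    p = in_S(mechanism(a, n=20000)).mean()
    q = in_S(mechanism(b, n=20000)).mean()
    print(math.log(p / q))  # empirical epsilon lower bound; stays below 1.0 up to sampling noise
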
Eureka: A General Framework for Black-box Differential Privacy Estimators
    Yun Lu, Yu Wei, M. Magdon-Ismail, Vassilis Zikas

    Computer Science

    IACR Cryptol. ePrint Arch.

  • 2022

This work devises a methodology for domain experts with limited knowledge of security to estimate the (differential) privacy of an arbitrary mechanism, and develops the first approximate estimator for the standard, i.e., non-relative, definition of Distributional Differential Privacy (DDP).

  • 6
  • PDF
Do I Get the Privacy I Need? Benchmarking Utility in Differential Privacy Libraries
    Gonzalo Munilla Garrido, Joseph P. Near, Aitsam Muhammad, Warren He, Roman Matzutt, F. Matthes

    Computer Science

    ArXiv

  • 2021

It is concluded that these libraries provide similar utility (except in some notable scenarios); however, there are significant differences in the features provided, and no single library excels in all areas.

...

...

49 References

Proving differential privacy with shadow execution
    Yuxin Wang, Zeyu Ding, Guanhong Wang, Daniel Kifer, Danfeng Zhang

    Computer Science

    PLDI

  • 2019

This paper proposes a new proof technique, called shadow execution, and embeds it into a language called ShadowDP, which uses shadow execution to generate proofs of differential privacy with very few programmer annotations and without relying on customized logics and verifiers.

Automated Methods for Checking Differential Privacy
    G. Barthe, Rohit Chadha, V. Jagannath, A. Sistla, Mahesh Viswanathan

    Computer Science

    ArXiv

  • 2019

This work proposes the first decision procedure for checking the differential privacy of a non-trivial class of probabilistic computations and implements it for (dis)proving privacy bounds for many well-known examples, including randomized response, histogram, report noisy max, and sparse vector (see the report noisy max sketch after this entry).

  • 7
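
For concreteness, here is the textbook Report Noisy Max benchmark in Python: the index-returning version with Laplace(2/ε) noise for sensitivity-1 queries is ε-DP, while the variant that returns the noisy value itself is a well-known buggy version used to evaluate such checkers. This is a standard rendering, not code from the paper.

    # Illustrative only: the classic Report Noisy Max benchmark, correct and
    # buggy variants side by side.
    import numpy as np

    def report_noisy_max(counts, eps):
        noisy = np.asarray(counts, dtype=float) + np.random.laplace(scale=2.0 / eps, size=len(counts))
        return int(np.argmax(noisy))            # correct: release only the index

    def report_noisy_max_buggy(counts, eps):
        noisy = np.asarray(counts, dtype=float) + np.random.laplace(scale=2.0 / eps, size=len(counts))
        return float(np.max(noisy))             # buggy: leaks the noisy value itself

    print(report_noisy_max([30, 10, 25], eps=0.5))
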
Advanced Probabilistic Couplings for Differential Privacy
    G. Barthe, Marco Gaboardi, Benjamin Grégoire, Justin Hsu, Pierre-Yves Strub

    Computer Science, Mathematics

    CCS

  • 2016

A new formalism is presented that extends apRHL, a relational program logic that has been used for proving differential privacy of non-interactive algorithms, and incorporates aHL, a (non-relational) program logic for accuracy properties; the formalism is exemplified on three classes of algorithms, and new variants of the Sparse Vector technique are explored.

Distance makes the types grow stronger: a calculus for differential privacy
    J. Reed, B. Pierce

    Computer Science, Mathematics

    ICFP '10

  • 2010

Rather than proving algorithms differentially private one at a time, this work proposes a functional language whose type system automatically guarantees differential privacy, allowing the programmer to write complex privacy-safe query programs in a flexible and compositional way (the sensitivity-calibrated noise this guarantees is sketched after this entry).

  • 248
  • PDF
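
The invariant Fuzz's type system tracks statically is function sensitivity, so that Laplace noise can be calibrated to it. The sketch below enforces the same idea dynamically in plain Python; it is not Fuzz code, and the names are mine.

    # Illustrative only: noise scaled to a query's sensitivity -- the guarantee
    # Fuzz establishes statically -- applied here by hand.
    import numpy as np

    def laplace_release(query, sensitivity, eps, db):
        return query(db) + np.random.laplace(scale=sensitivity / eps)

    # A 1-sensitive query: adding or removing one record changes the count by at most 1.
    count_over_30 = lambda db: sum(1 for age in db if age > 30)

    print(laplace_release(count_over_30, sensitivity=1.0, eps=0.5, db=[23, 45, 31, 62, 18]))
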
LightDP: towards automating differential privacy proofs
    Danfeng Zhang, Daniel Kifer

    Computer Science

    POPL

  • 2017

LightDP, a novel relational type system that separates relational reasoning from privacy budget calculations, is shown to verify sophisticated algorithms with little manual effort; it is powerful enough to verify algorithms where the composition theorem falls short.

Synthesizing coupling proofs of differential privacy
    Aws Albarghouthi, Justin Hsu

    Computer Science

    Proc. ACM Program. Lang.

  • 2018

A push-button, automated technique for verifying ε-differential privacy of sophisticated randomized algorithms is presented, providing the first automated privacy proofs for a number of challenging algorithms from the differential privacy literature, including Report Noisy Max, the Exponential Mechanism, and the Sparse Vector Mechanism (the Sparse Vector mechanism is sketched after this entry).

  • 65
  • Highly Influential
  • PDF
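
One of the verified algorithms, the Sparse Vector (AboveThreshold) mechanism, is shown below in its textbook form (Dwork and Roth): a noisy threshold at scale 2/ε, noisy queries at scale 4/ε, halting after the first query reported above threshold. This is a standard rendering, not code from the paper.

    # Illustrative only: textbook AboveThreshold / Sparse Vector.
    import numpy as np

    def above_threshold(queries, db, threshold, eps):
        noisy_threshold = threshold + np.random.laplace(scale=2.0 / eps)
        for i, q in enumerate(queries):          # queries must each have sensitivity 1
            if q(db) + np.random.laplace(scale=4.0 / eps) >= noisy_threshold:
                return i                         # report the first "above" and stop
        return None                              # every query stayed below the threshold

    db = [23, 45, 31, 62, 18]
    queries = [lambda d, t=t: sum(1 for x in d if x > t) for t in (20, 30, 40, 50)]
    print(above_threshold(queries, db, threshold=3, eps=1.0))
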
Linear dependent types for differential privacy
    Marco Gaboardi, Andreas Haeberlen, Justin Hsu, Arjun Narayan, B. Pierce

    Computer Science, Mathematics

    POPL

  • 2013

DFuzz is presented, an extension of Fuzz combining linear indexed types and lightweight dependent types that allows a richer sensitivity analysis, one able to certify a larger class of queries as differentially private, including queries whose sensitivity depends on runtime information.

  • 186
  • PDF
Detecting Violations of Differential Privacy
    Zeyu Ding, Yuxin Wang, Guanhong Wang, Danfeng Zhang, Daniel Kifer

    Computer Science

    CCS

  • 2018

The problem of producing counterexamples showing that incorrect algorithms violate their claimed privacy is considered, and an evaluation on a variety of incorrect published algorithms validates the usefulness of the approach.

  • 113
  • Highly Influential
  • PDF
Proving Differential Privacy via Probabilistic Couplings
    G. Barthe, Marco Gaboardi, B. Grégoire, Justin Hsu, Pierre-Yves Strub

    Computer Science

    2016 31st Annual ACM/IEEE Symposium on Logic in…

  • 2016

This paper develops compositional methods for formally verifying differential privacy for algorithms whose analysis goes beyond the composition theorem, based on deep connections between differential privacy and probabilistic couplings, an established mathematical tool for reasoning about stochastic processes.

Duet: an expressive higher-order language and linear type system for statically enforcing differential privacy
    Joseph P. Near, David Darais, D. Song

    Computer Science, Mathematics

    Proc. ACM Program. Lang.

  • 2019

This work proposes Duet, an expressive higher-order language, linear type system, and tool for automatically verifying differential privacy of general-purpose higher-order programs, and implements several differentially private machine learning algorithms in Duet which have never before been automatically verified by a language-based tool.

  • 29
  • PDF

...

...
