Privacy Amplification for Episodic Training Methods

Research output: Contribution to journal › Conference article › peer-review

Abstract

It has been shown that differential privacy bounds improve when subsampling is used within a randomized mechanism. Episodic training, used in many standard machine learning techniques, relies on a multi-stage subsampling procedure that has not previously been analyzed for privacy-bound amplification. In this paper, we improve the calculation of privacy bounds in episodic training by thoroughly analyzing the privacy amplification afforded by this multi-stage subsampling procedure. The newly developed bound can be incorporated into existing privacy accounting methods.
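
As a rough illustration (not the refined bound derived in the paper), the sketch below applies the classical amplification-by-subsampling result, that an (ε, δ)-DP mechanism run on a Poisson subsample with rate q satisfies (log(1 + q(e^ε − 1)), qδ)-DP, to a hypothetical two-stage episodic sampler in which classes are kept with probability q_class and examples within kept classes with probability q_example, giving an effective per-example inclusion rate of q_class · q_example under an independence assumption. The names and parameters are illustrative, not taken from the paper.

```python
import math

def amplified_epsilon(eps: float, q: float) -> float:
    """Classical privacy amplification by Poisson subsampling:
    an (eps, delta)-DP mechanism applied to a q-subsample satisfies
    (log(1 + q * (exp(eps) - 1)), q * delta)-DP."""
    return math.log1p(q * math.expm1(eps))

# Hypothetical two-stage episodic sampler: classes are kept with
# probability q_class, then examples within kept classes with
# probability q_example. Assuming independent sampling stages, each
# example is included with probability q_class * q_example.
q_class, q_example = 0.1, 0.05
q_effective = q_class * q_example

base_eps = 1.0       # per-step epsilon of the underlying mechanism
base_delta = 1e-6    # per-step delta of the underlying mechanism

print("amplified epsilon:", amplified_epsilon(base_eps, q_effective))
print("amplified delta:  ", q_effective * base_delta)
```

Treating the two stages as a single effective rate is only a baseline; a tighter multi-stage analysis of the kind the paper describes would feed its bound directly into a privacy accountant in place of the single-rate formula above.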

Original language: English
Journal: CEUR Workshop Proceedings
Volume: 3318
State: Published - 2022
Event: 2022 International Conference on Information and Knowledge Management Workshops, CIKM-WS 2022 - Atlanta, United States
Duration: Oct 17, 2022 - Oct 21, 2022

Funding

This manuscript has been authored by UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the US Department of Energy (DOE). The publisher acknowledges the US government license to provide public access under the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan). Research sponsored by the Laboratory Directed Research and Development Program of Oak Ridge National Laboratory, managed by UT-Battelle, LLC, for the U.S. Department of Energy.

Funders
DOE Public Access Plan
U.S. Department of Energy
Oak Ridge National Laboratory
