This repository contains the implementation for GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators (NeurIPS 2020).
Contact: Dingfan Chen ([email protected])
The environment can be set up using Anaconda with the following commands:

```bash
conda create --name gswgan-pytorch python=3.6
conda activate gswgan-pytorch
conda install pytorch=1.2.0
conda install torchvision -c pytorch
pip install -r requirements.txt
```
```bash
cd source
sh pretrain.sh
```
- To run the training in parallel: adjust the `meta_start` argument and run the script multiple times in parallel.
- Alternatively, you can download the pre-trained models using the links below.
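A hypothetical launcher for the parallel option is sketched below. The script name, the `-meta_start` flag form, and the offsets are assumptions for illustration; check `pretrain.sh` for the real invocation.

```shell
#!/bin/bash
# Sketch: split warm-start pretraining across several parallel runs,
# each starting at a different 'meta_start' offset (values illustrative,
# and 'pretrain.py' is an assumed entry point).
for start in 0 250 500 750; do
    python pretrain.py -meta_start "$start" &  # run each shard in the background
done
wait  # block until all background runs finish
```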
```bash
cd source
python main.py -data 'mnist' -name 'ResNet_default' -ldir '../results/mnist/pretrain/ResNet_default'
```
- Please refer to `source/config.py` (or execute `python main.py -h`) for the complete list of arguments.
- The default setting requires ~22 GB of GPU memory. Please allocate multiple GPUs by specifying the `-ngpus` argument if it does not fit into the memory of a single GPU.
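During the DP training step, GS-WGAN sanitizes only the gradient that flows from the discriminator back to the generator, by clipping its norm and adding Gaussian noise. A minimal NumPy sketch of that clip-and-noise operation follows; the function name and parameter values are illustrative, not the repository's defaults.

```python
import numpy as np

def sanitize_gradient(grad, clip_bound=1.0, noise_multiplier=1.07, rng=None):
    """Clip a gradient to L2 norm <= clip_bound, then add Gaussian noise.

    Mirrors the gradient-sanitization idea of GS-WGAN; the defaults here
    are illustrative, not the repository's actual settings.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    norm = np.linalg.norm(grad)
    # Scale down only if the gradient exceeds the clipping bound.
    clipped = grad * min(1.0, clip_bound / max(norm, 1e-12))
    # Gaussian noise scaled to the clipping bound (the sensitivity).
    noise = rng.normal(0.0, noise_multiplier * clip_bound, size=grad.shape)
    return clipped + noise
```

In the actual implementation this operation is applied via a backward hook at the generator/discriminator interface, so the generator never sees an unsanitized gradient.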
- To compute the privacy cost:

```bash
cd evaluation
python privacy_analysis.py -data 'mnist' -name 'ResNet_default'
```
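The privacy analysis follows the Rényi DP (RDP) accounting style: accumulate the RDP of each training step over a range of orders, then convert to an (ε, δ) guarantee. Below is a simplified, self-contained sketch for the plain Gaussian mechanism; the actual script additionally accounts for discriminator subsampling, and all function and parameter names here are mine, not the repository's.

```python
import math

def rdp_gaussian(alpha, sigma):
    # RDP of the Gaussian mechanism at order alpha (L2 sensitivity 1).
    return alpha / (2 * sigma ** 2)

def eps_from_rdp(rdp, alpha, delta):
    # Standard RDP -> (eps, delta) conversion.
    return rdp + math.log(1 / delta) / (alpha - 1)

def compute_eps(sigma, steps, delta, orders=range(2, 64)):
    # RDP composes additively over steps; take the best order.
    best = float("inf")
    for alpha in orders:
        rdp = steps * rdp_gaussian(alpha, sigma)
        best = min(best, eps_from_rdp(rdp, alpha, delta))
    return best
```

As expected, the resulting ε grows with the number of training steps and shrinks as the noise scale σ increases.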
Pre-trained model checkpoints can be downloaded using the links below. The discriminators are obtained after the warm-starting step (step 1), while the generators are obtained after the DP training step (step 2). The pre-trained models are stored as `.pth` files, and the corresponding training configurations are stored in `params.pkl` and `params.txt`.
| | Generator | Discriminators |
|---|---|---|
| MNIST | link | link |
| Fashion-MNIST | link | link |
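Once downloaded, a checkpoint's training configuration can be inspected by unpickling `params.pkl`. A self-contained sketch follows; the demo dictionary keys are stand-ins, since the actual keys come from the repository's argument parser.

```python
import os
import pickle
import tempfile

def load_config(path):
    # Read a pickled configuration dictionary (as stored in params.pkl).
    with open(path, "rb") as f:
        return pickle.load(f)

# Round-trip demo with a stand-in config so the sketch is self-contained;
# these keys are hypothetical, not the repository's actual argument names.
demo = {"dataset": "mnist", "noise_multiplier": 1.07}
path = os.path.join(tempfile.gettempdir(), "params_demo.pkl")
with open(path, "wb") as f:
    pickle.dump(demo, f)
config = load_config(path)
```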
```bibtex
@inproceedings{neurips20chen,
  title     = {GS-WGAN: A Gradient-Sanitized Approach for Learning Differentially Private Generators},
  author    = {Dingfan Chen and Tribhuvanesh Orekondy and Mario Fritz},
  year      = {2020},
  date      = {2020-12-06},
  booktitle = {Neural Information Processing Systems (NeurIPS)},
  pubstate  = {published},
  tppubtype = {inproceedings}
}
```
Our implementation uses the source code from the following repositories: