A TensorFlow implementation of the Importance Weighted Autoencoder (IWAE) [1]
- tensorflow
- numpy
- matplotlib
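For context, the training objective implemented here is the IWAE bound from [1]: the log of the average of k importance weights w_i = p(x, z_i) / q(z_i | x), with z_i sampled from the encoder q(z | x). Below is a minimal NumPy sketch of that estimator for a single datapoint; the function and variable names are illustrative and not taken from `main.py`.

```python
import numpy as np

def iwae_bound(log_p_x_given_z, log_p_z, log_q_z_given_x):
    """k-sample importance weighted bound for a single datapoint.

    Each argument is an array of shape (k,) holding per-sample log densities
    log p(x|z_i), log p(z_i), log q(z_i|x) for k samples z_i ~ q(z|x).
    Returns log(1/k * sum_i w_i), with w_i = p(x, z_i) / q(z_i|x),
    computed via log-mean-exp for numerical stability.
    """
    log_w = log_p_x_given_z + log_p_z - log_q_z_given_x  # log importance weights
    m = np.max(log_w)
    return m + np.log(np.mean(np.exp(log_w - m)))         # log-mean-exp

# Toy usage with random log densities: k = 1 reduces to the usual VAE/ELBO
# estimate; larger k gives a tighter bound on log p(x) in expectation.
k = 5
rng = np.random.default_rng(0)
print(iwae_bound(rng.normal(size=k), rng.normal(size=k), rng.normal(size=k)))
```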
python main.py --dataset {mnist,omniglot} \
--k <# of particles for training> \
--test_k <# of particles for testing> \
--n_steps <# of steps> \
--batch_size <batch size>
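For example, the k = 50 MNIST entry in the results table below corresponds to a run of roughly the following form:

```
python main.py --dataset mnist \
               --k 50 \
               --test_k 5000 \
               --n_steps 400000 \
               --batch_size 100
```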
- MNIST - downloaded automatically by TensorFlow (see the loading sketch below)
- OMNIGLOT - run `download_omniglot.sh`
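For reference, "downloaded automatically by TensorFlow" typically means something along the lines of the built-in MNIST loader below; the exact loading code used in this repo may differ.

```python
import tensorflow as tf

# Downloads MNIST on first use and caches it locally (under ~/.keras/datasets).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
print(x_train.shape)  # (60000, 28, 28), uint8 pixel values in [0, 255]
```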
The following are the negative log-likelihood (NLL) values after training for 400,000 steps with a batch size of 100, for different numbers of training particles (k) and test_k = 5000.
k | NLL (MNIST) | NLL (OMNIGLOT) |
---|---|---|
1 | 90.26 | 114.68 |
5 | 88.49 | 112.25 |
50 | 87.34 | 110.31 |
[1] Burda, Y., Grosse, R. and Salakhutdinov, R., 2015. Importance Weighted Autoencoders. arXiv preprint arXiv:1509.00519.