
Log-Likelihood Ratio Minimizing Flows

We introduce a new adversarial log-likelihood ratio domain alignment objective, show how to upper-bound it with a simple and stable minimization objective when the domain transformation is a normalizing flow, and relate it to the Jensen–Shannon divergence and GANs.

Log-Likelihood Ratio Minimizing Flows: Towards Robust and Quantifiable Neural Distribution Alignment
Ben Usman, Nick Dufour, Avneesh Sud, Kate Saenko
Advances in Neural Information Processing Systems (NeurIPS) 2020
video [3min] / poster / proceedings / bib

Distribution alignment has many applications in deep learning, including domain adaptation and unsupervised image-to-image translation. Most prior work on unsupervised distribution alignment relies either on minimizing simple non-parametric statistical distances such as maximum mean discrepancy or on adversarial alignment. However, the former fails to capture the structure of complex real-world distributions, while the latter is difficult to train and does not provide any universal convergence guarantees or automatic quantitative validation procedures. In this paper, we propose a new distribution alignment method based on a log-likelihood ratio statistic and normalizing flows. We show that, under certain assumptions, this combination yields a deep neural likelihood-based minimization objective that attains a known lower bound upon convergence. We experimentally verify that minimizing the resulting objective results in domain alignment that preserves the local structure of input domains.
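For intuition, the simplest instance of this idea (the 1D Gaussian case explored in the first notebook below) can be sketched in a few lines of JAX. This is a hedged toy illustration rather than the paper's full training objective: the flow is a single affine map T(x) = eˢ·x + t, and because the Gaussian maximum likelihood fit is available in closed form, the inner maximization of the likelihood-ratio statistic is exact and the statistic can be minimized directly by gradient descent.

```python
import numpy as np
import jax.numpy as jnp
from jax import grad, jit

def gaussian_max_ll(x):
    # Total log-likelihood of x under its own MLE 1D Gaussian fit.
    n = x.shape[0]
    return -0.5 * n * (jnp.log(2.0 * jnp.pi * jnp.var(x)) + 1.0)

def lrmf_loss(params, a, b):
    # Likelihood-ratio statistic: two separate Gaussian fits vs. one shared
    # fit of the pooled sample. Non-negative; zero exactly when T(a) and b
    # share the same MLE mean and variance.
    s, t = params
    ta = jnp.exp(s) * a + t            # affine "flow" T
    both = jnp.concatenate([ta, b])
    stat = gaussian_max_ll(ta) + gaussian_max_ll(b) - gaussian_max_ll(both)
    return stat / both.shape[0]        # per-sample scale keeps gradients O(1)

rng = np.random.RandomState(0)
a = jnp.asarray(rng.normal(0.0, 1.0, 500))   # source samples: N(0, 1)
b = jnp.asarray(rng.normal(1.0, 2.0, 500))   # target samples: N(1, 4)

params = jnp.zeros(2)                         # start from the identity flow
loss_grad = jit(grad(lrmf_loss))
init_loss = float(lrmf_loss(params, a, b))
for _ in range(2000):                         # plain gradient descent
    params = params - 0.2 * loss_grad(params, a, b)
final_loss = float(lrmf_loss(params, a, b))
```

After training, the transformed source sample matches the target's mean and variance and the statistic drops to (near) zero, which is the "known lower bound upon convergence" mentioned above in this toy setting.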

Experiments

This repo contains five (mostly) self-contained experiments implemented in JAX and TensorFlow Probability. Some of them are heavy enough to make the GitHub notebook viewer fail, so we suggest using the Colab links and viewing the experiments in the following order:

  • jax_gaussian_and_real_nvp.ipynb [colab]: we define 1D and 2D Gaussian likelihood-ratio minimizing flows (LRMF) and verify that they successfully align normally distributed datasets and match the first two moments of the moons distributions. We then define a RealNVP LRMF and apply it to normally distributed datasets, confirming that LRMF works correctly in the overparameterized regime.
  • tfp_moons_real_nvp_ffjord.ipynb [colab]: we define RealNVP and FFJORD LRMFs and show that they successfully align the moons datasets. We also show that our approach produces a more semantically meaningful alignment on this dataset than the alternatives.
  • tfp_digit_embeddings_real_nvp.ipynb [colab]: we embed MNIST and USPS into a lower-dimensional space using a shared VAEGAN and train a RealNVP LRMF to align the resulting embedding clouds.
  • jax_gmm_vanishing_gradient.ipynb [colab]: we show that even the 1D equal Gaussian mixture LRMF suffers from poly-exponentially vanishing generator gradients as the distributions are drawn further apart (see Example 2.2 in the paper).
  • tfp_digits_glow_fails.ipynb [colab] (uses glow_lrmf.py from this repo): an example of a failure due to vanishing gradients in higher dimensions: GLOW fails to converge to the global minimum. We implemented the stochastic Polyak step size, but it did not help.
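Several of the notebooks above build the domain transformation from RealNVP affine coupling layers. As background, here is a generic sketch of the standard coupling transform and its analytic inverse in JAX; this is not code from this repo, and the tiny `s_t_net` with fixed random weights is a hypothetical placeholder for a real conditioner network:

```python
import numpy as np
import jax.numpy as jnp

# Toy scale/translation network with fixed random weights (placeholder).
rng = np.random.RandomState(0)
W1, b1 = jnp.asarray(rng.randn(2, 8) * 0.1), jnp.zeros(8)
W2, b2 = jnp.asarray(rng.randn(8, 4) * 0.1), jnp.zeros(4)

def s_t_net(x_fixed):
    h = jnp.tanh(x_fixed @ W1 + b1)
    s, t = jnp.split(h @ W2 + b2, 2, axis=-1)
    return jnp.tanh(s), t              # bounded log-scale for stability

def coupling_forward(x, mask):
    # Keep masked dims unchanged; affinely transform the remaining dims
    # conditioned on the masked ones.
    s, t = s_t_net(x * mask)
    s, t = s * (1 - mask), t * (1 - mask)
    y = x * mask + (1 - mask) * (x * jnp.exp(s) + t)
    return y, jnp.sum(s, axis=-1)      # log|det Jacobian| = sum of log-scales

def coupling_inverse(y, mask):
    # Masked dims pass through untouched, so the conditioner sees the same
    # input in both directions and the layer inverts exactly.
    s, t = s_t_net(y * mask)
    s, t = s * (1 - mask), t * (1 - mask)
    return y * mask + (1 - mask) * (y - t) * jnp.exp(-s)

mask = jnp.array([1.0, 0.0])           # transform the second coordinate
x = jnp.asarray(rng.randn(64, 2))
y, log_det = coupling_forward(x, mask)
x_rec = coupling_inverse(y, mask)
```

Stacking such layers with alternating masks gives an expressive flow with a cheap exact log-determinant, which is what makes the likelihood terms in the LRMF objective tractable.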

Visualizations

An animated version of Figure 4 from the paper.

Columns (left to right): MMD, EMD, back-to-back flow, LRMF.

Rows: distributions (aligning red to blue), points colored according to their local coordinate system, class labels of the "red" dataset (one label per moon arm), classification accuracy, and the corresponding loss.

Here is the final flow that LRMF learned for aligning the moon distributions:

Citation

@inproceedings{usman2020lrmf,
 author = {Usman, Ben and Sud, Avneesh and Dufour, Nick and Saenko, Kate},
 booktitle = {Advances in Neural Information Processing Systems},
 editor = {H. Larochelle and M. Ranzato and R. Hadsell and M. F. Balcan and H. Lin},
 pages = {21118--21129},
 publisher = {Curran Associates, Inc.},
 title = {Log-Likelihood Ratio Minimizing Flows: Towards Robust and Quantifiable Neural Distribution Alignment},
 url = {https://proceedings.neurips.cc/paper/2020/file/f169b1a771215329737c91f70b5bf05c-Paper.pdf},
 volume = {33},
 year = {2020}
}
