janosh / awesome-normalizing-flows
Awesome resources on normalizing flows.
License: MIT License
Hi @janosh,
I just uploaded a comprehensive tutorial on YouTube: https://www.youtube.com/watch?v=IuXU2dBOJyw
Since it is my own video, I am not sending a pull request myself; please assess whether it is good enough to be part of this repository.
Regards & thanks
Kapil
Hi! This page is awesome (hence the name :). However, it would be even more useful if you listed things (especially code) newest first instead of oldest first, i.e. in reverse chronological order.
I would like to add the package jammy_flows to the collection (https://github.com/thoglu/jammy_flows). I developed it for normalizing-flow inference in the context of astro-particle physics (https://arxiv.org/abs/2008.05825), but I think it might be useful in general.
It stands for Joint Autoregressive Manifold (MY) flows and models a (conditional) PDF on a tensor product of manifolds. The sub-PDFs are connected autoregressively, similar to Inverse Autoregressive Flows (IAF, Kingma et al. 2016), but generalize IAF in two ways: 1) arbitrary (non-affine) coupling layers are allowed (every flow in the package is amortizable), and 2) because the couplings are general, flows on different manifolds (e.g. Euclidean and a sphere) can be linked.
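In symbols (my own notation, not the package's), the autoregressive linking amounts to factorizing the joint density over the product manifold, e.g. for a Euclidean part and a sphere:

p(x_{\mathbb{R}^3}, x_{S^2}) = p(x_{\mathbb{R}^3}) \, p(x_{S^2} \mid x_{\mathbb{R}^3}),

where each factor is modeled by a flow on its own manifold and the conditioning is realized through the general couplings.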
It is mostly designed for low-dimensional applications (maybe a few tens of dimensions, although simple flows like the affine flow should work reasonably at much higher dimensionality) and should be simple to set up.
For example, a 5-d PDF defined on a 3-dimensional Euclidean manifold and an autoregressively linked 2-sphere conditional PDF is defined like this (together they form a joint distribution on the tensor product space ℝ³ × S²):
import jammy_flows

# manifold structure "e3+s2", flow layers "gg+n" (both explained below)
pdf = jammy_flows.pdf("e3+s2", "gg+n")
The first argument defines the autoregressive manifold structure; the second argument defines the exact flow layers used for each manifold.
Each flow is abbreviated with a letter (see below): for example, "g" stands for a Gaussianization flow layer and "n" for an autoregressive flow on the 2-sphere. The autoregressive connectivity and amortization (for conditional PDFs) are taken care of by the module, and the options for the different manifolds are joined with "+" in the configuration string.
Without much tuning, you should get something that just works and has the properties of a flexible PDF; however, the flow parameters and connectivity can also be customized extensively if desired.
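As a further illustration, here is a hypothetical larger configuration following the same convention (my own example; the exact set of valid manifold and flow letters is documented in the README, so treat this as a sketch):

import jammy_flows

# Three autoregressively linked manifolds, flow letters per manifold joined by "+":
# "gg" = two Gaussianization layers on e3, "n" = spherical flow on s2,
# "g" = one Gaussianization layer on an extra 1-d Euclidean part (assumed "e1").
pdf = jammy_flows.pdf("e3+s2+e1", "gg+n+g")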
It implements a few state-of-the-art flows that, to my knowledge, are not really found in other repositories yet (e.g. Gaussianization flows).
Currently implemented manifolds with respective flows (from the README):
Euclidean flows:
1-sphere flows:
2-sphere flows:
Interval flows:
Simplex flows:
All of those can be combined in any way and the package automatically manages the connectivity.
More info can also be found in the docs.
Best,
Thorsten
Hi, thanks for the comprehensive list and nice summary of materials. I would like to recommend SurVAE Flows for the list, which I found an interesting read.
They present a generalized framework (SurVAE Flows) which encompasses flows (deterministic maps) and VAEs (stochastic maps). By seeing a deterministic map (x = f(z)) as a limiting case of a stochastic map (x ~ p(x|z)), the ELBO can be reinterpreted as a change-of-variables formula for stochastic maps. Moreover, stochastic maps are able to model surjections, which might be useful for incorporating bottleneck architectures into flows. They also give a few examples of surjective layers, which can be composed together with flow layers.
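To make the correspondence concrete (my own notation, a sketch rather than the paper's exact formulation): a bijective flow x = f(z) has the exact likelihood

\log p(x) = \log p_Z\big(f^{-1}(x)\big) + \log \left| \det \frac{\partial f^{-1}(x)}{\partial x} \right|,

while a stochastic map only admits the ELBO

\log p(x) \ge \mathbb{E}_{q(z \mid x)}\big[ \log p(x \mid z) + \log p_Z(z) - \log q(z \mid x) \big].

In the limit p(x|z) → δ(x − f(z)) with q(z|x) → δ(z − f⁻¹(x)), the bound becomes tight and the difference log p(x|z) − log q(z|x) reduces to the log-Jacobian determinant, recovering the change-of-variables formula.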
Great overview!
Thanks for the effort
Is there a spot for the video tutorial created by one of the authors of "Normalizing Flows: An Introduction and Review of Current Methods", presented at ECCV 2020?
Variational Inference using Normalizing Flows (VINF)
PDF: https://pierresegonne.github.io/VINF/
Medium: https://medium.com/swlh/normalizing-flows-are-not-magic-22752d0c924
GitHub: https://github.com/pierresegonne/VINF
Hello 👋,
TL;DR: The lampe package implements normalizing flows with PyTorch. I believe this is relevant for this collection. I hope you like it!
I'm a researcher interested in simulation-based inference and posterior estimation. I have written a low-level library for amortized posterior estimation called lampe. Initially, LAMPE relied on nflows for its normalizing flows, but that quickly became a limitation. I was not happy with some of nflows' design choices: for instance, it is only possible to sample or evaluate batches, most operators do not support broadcasting, and it is not possible to use networks other than the built-in ones. I considered contributing to nflows, but the package does not seem to be actively developed anymore.
So I decided to implement my own normalizing flows within LAMPE. The goal was to rely as much as possible on the already existing distributions and transformations of PyTorch. Unfortunately, PyTorch distributions and transforms are not modules: they don't implement a forward method, you cannot send their parameters to GPU with .to('cuda'), or even get their parameters with .parameters(). To solve this problem, LAMPE defines two (abstract) classes: DistributionModule and TransformModule. The former is any nn.Module whose forward method returns a PyTorch Distribution; similarly, the latter is any nn.Module whose forward method returns a PyTorch Transform. Then, what is a normalizing flow? It is simply an nn.Module that is constructed from a base DistributionModule and a list of TransformModule.
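A minimal sketch of that design (my own illustration, not LAMPE's actual source) might look like this:

import torch.nn as nn
from torch.distributions import Distribution, Transform, TransformedDistribution


class DistributionModule(nn.Module):
    """An nn.Module whose forward() returns a torch Distribution."""

    def forward(self) -> Distribution:
        raise NotImplementedError


class TransformModule(nn.Module):
    """An nn.Module whose forward() returns a torch Transform."""

    def forward(self) -> Transform:
        raise NotImplementedError


class Flow(nn.Module):
    """A flow: a base DistributionModule plus a list of TransformModules.

    Parameters live in the submodules (so .to('cuda') and .parameters() work),
    while calling the module materializes a plain torch distribution.
    """

    def __init__(self, base, transforms):
        super().__init__()
        self.base = base
        self.transforms = nn.ModuleList(transforms)

    def forward(self) -> Distribution:
        return TransformedDistribution(self.base(), [t() for t in self.transforms])

Conditioning, as in the flow(y) call shown further below, would additionally thread the context through each submodule; this sketch only covers the unconditional case.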
This design allows for very concise implementations of quite complex flows. Currently, LAMPE implements the masked autoregressive flow (MAF), neural spline flow (NSF), neural autoregressive flow (NAF), and NAF based on unconstrained monotonic neural networks (UMNN). All these flows support coupling (2 passes for the inverse), fully autoregressive conditioning (as many passes as features), or anything in between (see Graphical Normalizing Flows). And all of that in about 800 lines of code, including whitespace and documentation. If you are interested, take a look at the transformations and flows.
Here is a small example with a neural spline flow (NSF).
>>> import lampe
>>> flow = lampe.nn.flows.NSF(7, context=16, transforms=3, hidden_features=[64] * 3, activation='ELU')
>>> flow
NSF(
  (transforms): ModuleList(
    (0): SoftclipTransform(bound=5.0)
    (1): MaskedAutoregressiveTransform(
      (base): MonotonicRQSTransform(bins=8)
      (order): [0, 1, 2, 3, 4, 5, 6]
      (params): MaskedMLP(
        (0): MaskedLinear(in_features=23, out_features=64, bias=True)
        (1): ELU(alpha=1.0)
        (2): MaskedLinear(in_features=64, out_features=64, bias=True)
        (3): ELU(alpha=1.0)
        (4): MaskedLinear(in_features=64, out_features=64, bias=True)
        (5): ELU(alpha=1.0)
        (6): MaskedLinear(in_features=64, out_features=161, bias=True)
      )
    )
    (2): MaskedAutoregressiveTransform(
      (base): MonotonicRQSTransform(bins=8)
      (order): [6, 5, 4, 3, 2, 1, 0]
      (params): MaskedMLP(
        (0): MaskedLinear(in_features=23, out_features=64, bias=True)
        (1): ELU(alpha=1.0)
        (2): MaskedLinear(in_features=64, out_features=64, bias=True)
        (3): ELU(alpha=1.0)
        (4): MaskedLinear(in_features=64, out_features=64, bias=True)
        (5): ELU(alpha=1.0)
        (6): MaskedLinear(in_features=64, out_features=161, bias=True)
      )
    )
    (3): MaskedAutoregressiveTransform(
      (base): MonotonicRQSTransform(bins=8)
      (order): [0, 1, 2, 3, 4, 5, 6]
      (params): MaskedMLP(
        (0): MaskedLinear(in_features=23, out_features=64, bias=True)
        (1): ELU(alpha=1.0)
        (2): MaskedLinear(in_features=64, out_features=64, bias=True)
        (3): ELU(alpha=1.0)
        (4): MaskedLinear(in_features=64, out_features=64, bias=True)
        (5): ELU(alpha=1.0)
        (6): MaskedLinear(in_features=64, out_features=161, bias=True)
      )
    )
    (4): Inverse(SoftclipTransform(bound=5.0))
  )
  (base): DiagNormal(loc: torch.Size([7]), scale: torch.Size([7]))
)
The flow itself is an nn.Module. To condition the flow with respect to a context y, we call it: this returns a distribution that can be evaluated (log_prob) or sampled (sample) just like any torch distribution.
>>> y = torch.randn(16)
>>> conditioned = flow(y)
>>> conditioned.sample()
tensor([ 1.1381, 0.3619, -1.9963, 0.2681, -0.1613, 0.1885, -0.4108])
>>> conditioned.sample((5, 6)).shape
torch.Size([5, 6, 7])
>>> x = torch.randn(7)
>>> conditioned.log_prob(x)
tensor(-8.6289, grad_fn=<AddBackward0>)
>>> x = torch.randn(5, 6, 7)
>>> conditioned.log_prob(x).shape
torch.Size([5, 6])
Add the stochastic normalizing flows paper. It adds stochasticity to your NF to help with better sampling.
Has the author tried applying normalizing flows to temporal prediction, or using normalizing flows for time series data?
Would be good to go back in git history and set the date_added field on some of the older items in this collection. date_added should be mandatory for all newly added items.
Related: #44.
You can check it at https://amaanv.com/articles under the heading 'Normalizing Flows'. I do it as a hobby and intend to write more. If you think it adds value to the repo, I would love it if you could add it.