Comments (3)
You're right: the distributions q(z) and p(z) have the same shape for a single datapoint. p(z) does not need to have a batch-size dimension, because the kl function supports broadcasting across the batch dimension, so it's effectively as if every element in the batch has a p(z) of dimension latent_dim.
Hope this helps!
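A minimal sketch of the broadcasting in question, in the same TF1 REPL style as the snippets further down. This assumes import numpy as np, import tensorflow as tf (1.x), and an active sess = tf.Session(); the shapes (batch_size=2, latent_dim=3) are just illustrative:
>>> q_z = tf.distributions.Normal(loc=np.zeros((2, 3), dtype=np.float32), scale=np.ones((2, 3), dtype=np.float32))
>>> p_z = tf.distributions.Normal(loc=np.zeros(3, dtype=np.float32), scale=np.ones(3, dtype=np.float32))
>>> kl = tf.distributions.kl_divergence(q_z, p_z)  # [batch_size, latent_dim] against [latent_dim]
>>> sess.run(kl).shape
(2, 3)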
Thanks!
I'm not sure I follow:
- the KL divergence has an analytic form in this case (we do not use samples from p_z to compute it, so I don't see an issue with the shape)
- p_z does not produce a 1-D tensor; it produces a tensor of shape latent_dim
Am I missing something?
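For reference, the analytic form mentioned in the first bullet is the standard closed-form KL between a diagonal Gaussian and a standard Normal prior, applied per latent dimension and broadcast across the batch:
KL( N(mu, sigma^2) || N(0, 1) ) = 0.5 * (mu^2 + sigma^2 - log(sigma^2) - 1)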
Hi,
What I mean is that the ways to create q_z and p_z are different.
q_z = distributions.Normal(loc=q_mu, scale=q_sigma)
Here q_mu and q_sigma have the same shape, [batch_size, latent_dim], but that's not the case for p_z. Here is an example:
>>> x = tf.distributions.Normal(loc=np.zeros(3, dtype=np.float32), scale=np.ones(3, dtype=np.float32))
>>> sess.run(x.sample())
array([ 1.4619417, -0.7553768,  2.2952332], dtype=float32)
>>>
>>> y = tf.distributions.Normal(loc=np.zeros((2, 3), dtype=np.float32), scale=np.ones((2, 3), dtype=np.float32))
>>> sess.run(y.sample())
array([[ 0.73629814,  1.0238795 ,  0.00905563],
       [-1.0864298 , -1.0123701 ,  0.7973023 ]], dtype=float32)
Of course, we calculate the KL divergence based on the distributions x and y, but I think the samples created by the two distributions should have the same shape, shouldn't they?
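One can check this directly by continuing the session above: the KL is computed from the distributions' parameters rather than their samples, so the mismatched sample shapes are fine, because the parameters broadcast.
>>> sess.run(tf.distributions.kl_divergence(y, x)).shape  # y: batch shape [2, 3], x: batch shape [3]
(2, 3)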