Comments (9)
Hi! Thanks for submitting the PR, it looks great.
> But I don't fully understand how the reparameterization trick works here, or whether variational parameters can be used in a base function.
The important thing is that the base distribution isn't allowed to depend on any variational parameters. One easy fix might be to simply decide that we don't support parameterising `nu` with a variational parameter. This might not be too much of a limitation if `nu` is often set to some constant in practice? We might be able to enforce this by throwing an error when `ad.isLifted(this.params.nu)` (because variational parameters will always be nodes of the AD graph during inference) while creating the base distribution. If you'd like to add this, that would be great. Alternatively, I'm happy to add it once this is merged.
> In such a case `transform` would produce stochastic output. Is that OK?
We assume that `transform` is deterministic, so likely not. We could view the product of both the normal and the gamma as the base distribution, leaving just a deterministic transform, but then the problem is that the gamma still depends on `nu`, and the gamma itself isn't straightforward to reparameterise. For this reason, I think something along the lines of your current approach might be the best option, at least initially.
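For concreteness, the scale-mixture view mentioned above can be sketched in plain JS as follows (only the deterministic transform is shown; the function name is illustrative, not part of webppl):

```javascript
// Scale-mixture view of Student's t: if w ~ Gamma(shape = nu/2, scale = 2/nu)
// and z ~ Normal(0, 1), then z / sqrt(w) ~ StudentT(nu). The pair (z, w)
// would play the role of the base distribution and the division the
// deterministic transform -- but the Gamma component still depends on nu,
// which is why this doesn't yield a clean reparameterisation.
function studentTFromBase(z, w) {
  return z / Math.sqrt(w);
}

// With w fixed at 1 the transform is the identity on z, matching the
// Normal limit as nu -> infinity.
console.log(studentTFromBase(3, 4)); // 1.5
```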
from webppl.
Thank you @null-a! Is it better to add such a check in the base function?
```js
base: function () {
  if (ad.isLifted(this.params.nu)) {
    util.error("This distribution doesn't support parametrization of nu");
  } else {
    return new StudentT({nu: this.params.nu, mu: 0, sigma: 1});
  }
}
```
Something like that?
> Something like that?
Yes, that looks good to me.
I had a couple of additional thoughts:

- Would you be able to rename `mu -> location` and `sigma -> scale`? We use both schemes with existing distributions, but people have previously expressed a preference for `location` and `scale`, so I think new distributions ought to use those.
- When performing variational inference, we generate a suitable default guide for any random choice that doesn't have one specified explicitly. By default, this will try to hook up `nu` to a variational parameter, which isn't what we want. Instead, we'll need to add some custom logic describing how to guide a `StudentT`. (I think this will hook up `mu` and `sigma` to parameters as normal, and use the value of `nu` from the choice in the model. This is similar to what we already do for the bounds of `Uniform`.) I can do this myself once the PR is merged, but feel free to take a look if you like.
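For anyone following along, a manual guide along those lines might look roughly like this in webppl surface syntax (a sketch only; the parameter names and the fixed `df` value are illustrative, and assume the renamed `df`/`location`/`scale` scheme):

```js
// Sketch: guide location/scale with variational parameters while keeping
// df fixed to the value used in the model (df itself is not learned).
var model = function () {
  return sample(StudentT({df: 4, location: 0, scale: 1}), {
    guide: function () {
      return StudentT({
        df: 4,  // reuse the model's df rather than a variational parameter
        location: param({name: 'loc'}),
        scale: Math.exp(param({name: 'logScale'}))  // keep scale positive
      });
    }
  });
};
```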
> We use both schemes with existing distributions, but people have previously expressed a preference for `location` and `scale`
No problem! I'll also rename `nu` to `df` (degrees of freedom) then.
What do you think about having aliases for distribution params?

```js
params: [
  {name: 'mu', aliases: ['loc', 'location', 'μ'] /* ... */},
  {name: 'sigma', aliases: ['scale', 'sd', 'σ'] /* ... */}
]
```

That would let multiple styles work:

```js
Gaussian({mu: ..., sigma: ...})
Gaussian({μ: ..., σ: ...})
```

To implement this, we could update the `dist` constructor in `base.ad.js` to map aliases from the user-provided `params` object to the internal `this.params`. For example, `{loc: 0, scale: 1}` would map to `{mu: 0, sigma: 1}`.
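A minimal sketch of that mapping in plain JS (the helper name and spec shape are illustrative, not webppl's actual internals):

```javascript
// Hypothetical alias normalization: rewrite user-supplied keys to the
// internal canonical names before constructing the distribution.
var paramSpec = [
  {name: 'mu', aliases: ['loc', 'location', 'μ']},
  {name: 'sigma', aliases: ['scale', 'sd', 'σ']}
];

function normalizeParams(spec, userParams) {
  var out = {};
  spec.forEach(function (p) {
    // Accept the canonical name itself plus any of its aliases.
    [p.name].concat(p.aliases).forEach(function (k) {
      if (Object.prototype.hasOwnProperty.call(userParams, k)) {
        out[p.name] = userParams[k];
      }
    });
  });
  return out;
}

console.log(normalizeParams(paramSpec, {loc: 0, scale: 1}));
// { mu: 0, sigma: 1 }
```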
> I can do this myself once the PR is merged, but feel free to take a look if you like.
It'd be awesome if you could finish the variational inference logic after merging the PR. I don't really know how to do that properly. Thank you!
> What do you think about having aliases for distribution params?
I agree that we'd need to do something like this if we wanted to get rid of the current inconsistencies (e.g. `Gaussian` uses `mu`, not `location`), but I think having just `location` and `scale` is OK for new distributions. One concern would be the overhead incurred every time a distribution object is created, but this kind of thing tends to be negligible in JS, so I imagine this would be OK. (Related: #630)
> It'd be awesome if you finish the variational inference logic after merging the PR.
Sure, no problem.
> and use the value of `nu` from the choice in the model
For my own reference: I'm not confident that my earlier suggestion will always be correct. A safer thing to do would be to pick some fixed `nu` to use for default guides. This isn't ideal, as users would need to write a guide manually to specify a different `nu`, and this won't be obvious. Is it possible that guiding with some other family (probably `Gaussian`) would be a better default than `StudentT` with fixed `nu`? Also, if we do guide with `StudentT` by default, can we hook up `nu` to a variational parameter when not using reparameterisation?
> Would you be able to rename `mu -> location` and `sigma -> scale`
Done. Also renamed `nu` to `df` (degrees of freedom) and updated the sampler test with the new params. Probably ready to merge.
Thanks, I've merged the PR. I decided to remove reparameterization altogether, since I wasn't confident my proposal was correct.