
Comments (9)

null-a commented on July 26, 2024

Hi! Thanks for submitting the PR, it looks great.

But I don't 100% understand how the re-parametrization trick works, or whether parameters can be used in a base function.

The important thing is that the base distribution isn't allowed to depend on any variational parameters. One easy fix might be to simply decide that we don't support parameterising nu with a variational parameter. This might not be too much of a limitation if nu is often set to some constant in practice? We might be able to enforce this by throwing an error when ad.isLifted(this.params.nu) while creating the base distribution (variational parameters will always be nodes of the AD graph during inference). If you'd like to add this, that would be great. Alternatively, I'm happy to add it once this is merged.

In that case transform would produce stochastic output. Is that OK?

We assume that transform is deterministic, so likely not. We could view the product of the normal and the gamma as the base distribution, leaving just a deterministic transform, but then the problem is that the gamma still depends on nu, and the gamma itself isn't straightforward to reparameterise. For this reason, I think something along the lines of your current approach might be best, at least initially.
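To make the scale-mixture view above concrete, here is a minimal sketch (hypothetical helper name, not part of the PR): if z is a standard normal draw and g is a draw from Gamma(shape nu/2, rate nu/2), the transform below yields a StudentT(nu, mu, sigma) sample.

```javascript
// Hypothetical sketch of the normal/gamma scale-mixture behind StudentT.
// If z ~ Normal(0, 1) and g ~ Gamma(shape nu/2, rate nu/2), then
// mu + sigma * z / sqrt(g) is distributed as StudentT(nu, mu, sigma).
// The transform is deterministic only given BOTH z and g; the gamma
// draw g is the part that depends on nu.
function studentTFromMixture(z, g, mu, sigma) {
  return mu + sigma * z / Math.sqrt(g);
}
```

This makes the difficulty explicit: treating only z as the base leaves a stochastic transform, while folding g into the base leaves a base distribution that depends on nu.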

from webppl.

zemlyansky commented on July 26, 2024

Thank you @null-a! Is it better to add such a check in the base function?

base: function () {
  if (ad.isLifted(this.params.nu)) {
    util.error("This distribution doesn't support parametrization of nu");
  } else {
    return new StudentT({nu: this.params.nu, mu: 0, sigma: 1});
  }
}

Something like that?


null-a commented on July 26, 2024

Something like that?

Yes, that looks good to me.

I had a couple of additional thoughts:

  1. Would you be able to rename mu -> location and sigma -> scale? We use both schemes with existing distributions, but people have previously expressed a preference for location and scale, so I think new distributions ought to use those.

  2. When performing variational inference, we generate a suitable default guide for any random choice that doesn't have one specified explicitly. By default, this will try to hook up nu to a variational parameter, which isn't what we want. Instead, we'll need to add some custom logic describing how to guide a StudentT. (I think this will hook up mu and sigma to parameters as normal, and use the value of nu from the choice in the model. This is similar to what we already do for the bounds of Uniform.) I can do this myself once the PR is merged, but feel free to take a look if you like.
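As a rough sketch of that guide logic (all names here are hypothetical; the real change would live in webppl's guide machinery), the default guide would learn mu and sigma while copying nu straight from the model's choice:

```javascript
// Hypothetical sketch: build guide parameters for a StudentT choice.
// registerParam stands in for webppl's variational parameter machinery.
function studentTGuideParams(modelParams, registerParam) {
  return {
    mu: registerParam('mu'),                    // learned variational parameter
    sigma: Math.exp(registerParam('logSigma')), // learned, kept positive via exp
    nu: modelParams.nu                          // fixed: copied from the model choice
  };
}
```

Keeping nu out of the learned parameters mirrors how the bounds of Uniform are handled: the guide only optimises the parameters that can safely be reparameterised.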


zemlyansky commented on July 26, 2024

We use both schemes with existing distributions, but people have previously expressed a preference for location and scale

No problem! I'll also rename nu to df (degrees of freedom) then.

What do you think about having aliases for distribution params?

params: [
  {name: 'mu', aliases:['loc', 'location', 'μ'] /*...*/},
  {name: 'sigma', aliases:['scale', 'sd', 'σ'] /*...*/}
]

That would allow multiple styles to work:

Gaussian({mu: ..., sigma: ...})
Gaussian({μ: ..., σ: ...})

To implement that, we could update the dist constructor in base.ad.js to map aliases from the user-provided param object to the internal this.params, e.g. {loc: 0, scale: 1} to {mu: 0, sigma: 1}.
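A minimal sketch of that alias mapping (hypothetical helper name; the real change would live in the dist constructor in base.ad.js):

```javascript
// Hypothetical sketch: canonicalize user-provided parameter names
// against a spec of canonical names and their aliases.
var paramSpec = [
  {name: 'mu', aliases: ['loc', 'location', 'μ']},
  {name: 'sigma', aliases: ['scale', 'sd', 'σ']}
];

function canonicalizeParams(userParams, spec) {
  var out = {};
  Object.keys(userParams).forEach(function (key) {
    var match = spec.filter(function (p) {
      return p.name === key || p.aliases.indexOf(key) !== -1;
    })[0];
    // Unrecognised keys pass through unchanged.
    out[match ? match.name : key] = userParams[key];
  });
  return out;
}
```

With this, canonicalizeParams({loc: 0, scale: 1}, paramSpec) yields {mu: 0, sigma: 1}, so Gaussian({mu: ...}) and Gaussian({μ: ...}) would reach the same internal representation.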


zemlyansky commented on July 26, 2024

I can do this myself once the PR is merged, but feel free to take a look if you like.

It'd be awesome if you could finish the variational inference logic after merging the PR. I don't really know how to do that correctly. Thank you!


null-a commented on July 26, 2024

What do you think about having aliases for distributions params?

I agree that we'd need to do something like this if we wanted to get rid of the current inconsistencies (e.g. Gaussian uses mu, not location), but I think having just location and scale is OK for new distributions. One concern would be the overhead incurred every time a distribution object is created, but this kind of thing tends to be negligible in JS, so I imagine this would be OK. (Related: #630)


null-a commented on July 26, 2024

It'd be awesome if you finish the variational inference logic after merging the PR.

Sure, no problem.

and use the value of nu from the choice in the model

For my own reference: I'm not confident that my earlier suggestion will always be correct. A safer thing to do would be to pick some fixed nu to use for default guides. This isn't ideal, as users would need to specify a guide manually to use a different nu, and this won't be obvious. Is it possible that guiding with some other family (probably Gaussian) would be a better default than StudentT with fixed nu? Also, if we do guide with StudentT by default, can we hook nu up to a variational parameter when not using reparameterisation?


zemlyansky commented on July 26, 2024

Would you be able to rename mu -> location and sigma -> scale

Done. Also renamed nu to df (degrees of freedom) and updated the sampler test with the new params. Probably ready to merge.


null-a commented on July 26, 2024

Thanks, I've merged the PR. I decided to remove reparameterization altogether, since I wasn't confident my proposal was correct.

