Experiments using Automatic Differentiation with DiffSharp for Maximum Likelihood Estimation.
See the accompanying blog post.
Exploring function minimization using the built-in SGD and Adam optimizers, in particular:
- Terminating gradient descent when using the built-in optimizers (see the first sketch after this list)
- Using the built-in optimizers with functions that take mixed argument types, e.g. bool and float (second sketch)
- Setting the built-in optimizers' objective to minimization, e.g. minimizing a negative log-likelihood in order to maximize the likelihood (third sketch)
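On termination: as a self-contained illustration of the idea, here is a minimal hand-rolled gradient descent loop over `dsharp.grad` that stops on a gradient-norm tolerance rather than a fixed iteration budget. The objective `f`, the learning rate, and the tolerance are all hypothetical, and the built-in `SGD`/`Adam` optimizers' own stopping behavior may differ; this sketch also assumes DiffSharp's usual `Tensor` helpers (`sum`, `sqrt`, `toDouble`).

```fsharp
open DiffSharp

// Hypothetical convex objective: squared distance to a fixed target point.
let target = dsharp.tensor [3.0; -1.0]
let f (x: Tensor) =
    let d = x - target
    (d * d).sum()

let lr = 0.1           // learning rate (hypothetical)
let tol = 1e-6         // stop when the gradient norm falls below this
let maxIters = 10_000  // safety cap so the loop always terminates

let mutable x = dsharp.tensor [0.0; 0.0]
let mutable iter = 0
let mutable converged = false
while not converged && iter < maxIters do
    let g = dsharp.grad f x
    let gradNorm = (g * g).sum().sqrt().toDouble()
    if gradNorm < tol then
        converged <- true   // near a stationary point: stop early
    else
        x <- x - lr * g     // plain gradient descent step
    iter <- iter + 1

printfn "converged=%b after %d iterations, x=%A" converged iter x
```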
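On mixed types: `dsharp.grad` and the optimizers operate on `Tensor -> Tensor` functions, so an objective with extra non-tensor arguments (a `bool` flag, a plain `float`, ...) has to have those arguments fixed first. A sketch of the partial-application approach, with hypothetical names throughout:

```fsharp
open DiffSharp

// Hypothetical objective of mixed arguments: a bool flag, a plain float
// weight, and the tensor parameters we actually optimize over.
let objective (robust: bool) (weight: float) (theta: Tensor) =
    let r = theta - dsharp.tensor [1.0; 2.0]
    let loss = if robust then r.abs().sum() else (r * r).sum()
    weight * loss

// Fix the non-tensor arguments by partial application, leaving a
// Tensor -> Tensor function that AD and the optimizers can handle.
let f : Tensor -> Tensor = objective true 0.5
let g = dsharp.grad f (dsharp.tensor [0.0; 0.0])
printfn "gradient: %A" g
```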
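On minimization: the built-in optimizers minimize their objective, so maximum likelihood estimation is carried out by minimizing the negative log-likelihood. Below is a minimal sketch for a normal model N(mu, sigma^2), again using a hand-rolled descent loop over `dsharp.grad` (not the repo's built-in-optimizer code) and hypothetical data; parameterizing by log sigma keeps sigma positive without constraints.

```fsharp
open DiffSharp

// Hypothetical observations; in the experiments this would be real data.
let data = dsharp.tensor [1.2; 0.8; 1.5; 0.9; 1.1; 1.4]
let n = float data.nelement

// Negative log-likelihood of N(mu, sigma^2), up to an additive constant.
// theta.[0] = mu, theta.[1] = log sigma (so sigma > 0 by construction).
let nll (theta: Tensor) =
    let mu = theta.[0]
    let logSigma = theta.[1]
    let sigma = logSigma.exp()
    let r = data - mu
    n * logSigma + (r * r).sum() / (2.0 * sigma * sigma)

// Minimizing the NLL maximizes the likelihood: this is how a
// minimization-only optimizer is pointed at an MLE problem.
let mutable theta = dsharp.tensor [0.0; 0.0]
let lr = 0.01
for _ in 1 .. 2000 do
    theta <- theta - lr * dsharp.grad nll theta

printfn "mu=%A sigma=%A" theta.[0] (theta.[1].exp())
```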