
Comments (3)

chjackson commented on July 19, 2024

Thanks - I've updated the comment above for optimHess. See also the examples I referred to in optim.R in the msm source.

I am not experienced with Julia, but I've had "proof of concept" success with the JuliaConnectoR package for calling a Julia routine from R while touching Julia as little as possible.
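
For example, a minimal round trip with JuliaConnectoR might look something like the sketch below (Statistics.mean is used purely as a stand-in for whatever Julia routine you actually want to reach; the optimiser call itself will depend on the Julia package):

library(JuliaConnectoR)

stopifnot(juliaSetupOk())           # check that a Julia installation can be found
juliaEval("using Statistics")       # load a Julia package in the background Julia session

# Call a Julia function on R data; the result is converted back to an R value
juliaCall("Statistics.mean", c(1, 2, 3))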

from msm.

chjackson commented on July 19, 2024

I've attempted to write some documentation here for how to add a new optimisation function to msm. If any of these steps go wrong for you, I'll refine these instructions. It is easier than you might think, because you don't need to download or edit the msm source code.

Suppose your external optimisation function is called NEWOPTFUNCTION. If you call msm(..., opt.method="NEWOPTFUNCTION"), then msm will look for a function called msm.optim.NEWOPTFUNCTION in your search path which connects this optimiser to msm. Then everything is handled automatically from there.

So you just need to write this function, based on the following template:

msm.optim.NEWOPTFUNCTION <- function(p, gr, hessian, msmdata, qmodel, qcmodel, cmodel, hmodel, ...) {
  optim.args <- list(...)
  optim.args <- c(optim.args,
                  list(# Supply the required arguments to NEWOPTFUNCTION here.
                       # The function to be optimised is lik.msm.
                       # The corresponding gradient function is gr (can be ignored if the optimiser doesn't use gradients).
                       # The initial values are in p$inits.
                       msmdata=msmdata, qmodel=qmodel, qcmodel=qcmodel,
                       cmodel=cmodel, hmodel=hmodel, paramdata=p))
  p$opt <- do.call(NEWOPTFUNCTION, optim.args)

  ## Fill in the following with the appropriate outputs of NEWOPTFUNCTION:
  # minimised -2 log likelihood
  p$lik <-                    

  # parameter estimates, including only those being optimised, ignoring those fixed at their initial values
  p$params[p$optpars] <- 

  p$opt$par <- p$params[p$optpars]
  p$opt$hessian <-        # the Hessian
  # If NEWOPTFUNCTION does not return the Hessian, it could be calculated numerically with:
  # p$opt$hessian <- stats::optimHess(par=p$opt$par, fn=lik.msm, gr=gr,
  #                                   msmdata=msmdata, qmodel=qmodel, qcmodel=qcmodel,
  #                                   cmodel=cmodel, hmodel=hmodel, paramdata=p)

  p
}

To see examples of how this function is written in practice, look at msm.optim.bobyqa (which uses minqa::bobyqa as the optimiser) and msm.optim.nlm (which uses stats::nlm) in optim.R.

You could also include error handling, e.g. to catch any error code returned by NEWOPTFUNCTION if it doesn't converge, and give an informative message.
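
For instance, a sketch of that error handling (assuming NEWOPTFUNCTION signals failure by throwing an R error; adapt this to however your optimiser actually reports non-convergence) might replace the do.call line in the template with:

  p$opt <- tryCatch(
    do.call(NEWOPTFUNCTION, optim.args),
    error = function(e)
      stop("NEWOPTFUNCTION failed: ", conditionMessage(e), call. = FALSE)
  )
  # If the optimiser returns a convergence code instead of throwing an error,
  # check it explicitly, e.g.
  # if (p$opt$convergence != 0) warning("NEWOPTFUNCTION reported non-convergence")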

Out of interest, what is the optimiser you are using? It may be worth making your msm.optim.NEWOPTFUNCTION available more widely if it is more reliable. On the other hand, if you are having to fiddle with optimisers to get results, then consider whether a simpler model will suit your purpose - beware of trying to squeeze information out of your data that isn't there.

from msm.

marc-vaisband commented on July 19, 2024

Thank you for your reply! The optimisation routine I'm using is actually in Julia: an adaptive differential evolution algorithm provided by the BlackBoxOptim.jl package, and I'm executing all the R code in an R environment provided via the RCall.jl package. (The reverse would presumably also be possible by pulling in this optimisation package through JuliaCall.) For the moment, I'll try to just write an msm.optim.dummy routine which returns the hard-coded best parameters and objective function value.
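
A first sketch of that dummy routine, following the template above (the numbers are just placeholders to be replaced by the results from BlackBoxOptim.jl, and I'm assuming the unexported likelihood can be reached as msm:::lik.msm), might be:

msm.optim.dummy <- function(p, gr, hessian, msmdata, qmodel, qcmodel, cmodel, hmodel, ...) {
  # Placeholder results, obtained externally (e.g. from BlackBoxOptim.jl)
  best.pars <- c(-2.1, -3.5)      # hypothetical estimates of the optimised parameters
  best.val  <- 1234.5             # hypothetical minimised -2 log-likelihood

  p$lik <- best.val
  p$params[p$optpars] <- best.pars
  p$opt <- list(par = p$params[p$optpars])
  p$opt$hessian <- stats::optimHess(par = p$opt$par, fn = msm:::lik.msm, gr = gr,
                                    msmdata = msmdata, qmodel = qmodel, qcmodel = qcmodel,
                                    cmodel = cmodel, hmodel = hmodel, paramdata = p)
  p
}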

When using stats::optimHess for the Hessian estimation, what would the call to msm:::lik.msm look like? Can the gradient passed via gr be used directly inside the optimHess call?

from msm.
