Comments (3)
Thanks - I've updated the comment above for optimHess. See also the examples I referred to in optim.R in the msm source.

I am not experienced with Julia, but I've had "proof of concept" success with the JuliaConnectoR package for calling a Julia routine from R while touching Julia as little as possible.
from msm.
I've attempted to write some documentation here for how to add a new optimisation function to msm. If any of these steps go wrong for you, I'll refine these instructions. It is easier than you might think, because you don't need to download or edit the msm source code.

Suppose your external optimisation function is called NEWOPTFUNCTION. If you call msm(..., opt.method="NEWOPTFUNCTION"), then msm will look for a function called msm.optim.NEWOPTFUNCTION in your search path, which connects this optimiser to msm. Everything else is handled automatically from there.
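That lookup step can be sketched in plain R. This is a hedged illustration of the naming convention only, not msm's actual internal code, and msm.optim.NEWOPTFUNCTION below is a stand-in definition:

```r
## Hedged sketch: how a function named after opt.method can be found on
## the search path. The stand-in function just reports that it was called.
msm.optim.NEWOPTFUNCTION <- function(...) "optimiser called"
opt.method <- "NEWOPTFUNCTION"
f <- get(paste0("msm.optim.", opt.method), mode = "function")
f()   # "optimiser called"
```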
So you just need to write this function, based on the following template:
msm.optim.NEWOPTFUNCTION <- function(p, gr, hessian, msmdata, qmodel, qcmodel, cmodel, hmodel, ...) {
    optim.args <- list(...)
    optim.args <- c(optim.args,
                    list(## Supply the required arguments to NEWOPTFUNCTION here.
                         ## The function to be optimised is lik.msm,
                         ## the corresponding gradient function is gr (ignore this if the optimiser doesn't use gradients),
                         ## and the initial values are in p$inits.
                         msmdata=msmdata, qmodel=qmodel, qcmodel=qcmodel,
                         cmodel=cmodel, hmodel=hmodel, paramdata=p))
    p$opt <- do.call(NEWOPTFUNCTION, optim.args)
    ## Fill in the following with the appropriate outputs of NEWOPTFUNCTION:
    ## minimised -2 log-likelihood
    p$lik <-
    ## parameter estimates, including only those being optimised, ignoring those fixed at their initial values
    p$params[p$optpars] <-
    p$opt$par <- p$params[p$optpars]
    p$opt$hessian <-  # the Hessian
    ## If NEWOPTFUNCTION does not return the Hessian, it could be calculated numerically with:
    ## p$opt$hessian <- stats::optimHess(par=p$opt$par, fn=lik.msm, gr=gr,
    ##                                   msmdata=msmdata, qmodel=qmodel, qcmodel=qcmodel,
    ##                                   cmodel=cmodel, hmodel=hmodel, paramdata=p)
    p
}
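As a self-contained illustration of the do.call pattern in the template, here is a toy version with stats::optim standing in for NEWOPTFUNCTION and a simple quadratic in place of lik.msm (none of the msm objects are needed):

```r
## Toy stand-ins: stats::optim plays the role of NEWOPTFUNCTION and a
## quadratic plays the role of lik.msm. Extra user options arrive first,
## then the required arguments are appended, mirroring the template.
inits <- c(1, 1)
optim.args <- list(control = list(maxit = 500))     # options from ...
optim.args <- c(optim.args,
                list(par = inits,                   # initial values (p$inits in msm)
                     fn = function(p) sum(p^2)))    # function to minimise
fit <- do.call(stats::optim, optim.args)
fit$value   # minimised value, close to 0
```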
To see examples of how this function is written in practice, look at msm.optim.bobyqa (which uses minqa::bobyqa as the optimiser) and msm.optim.nlm (which uses stats::nlm) in optim.R.

You could also include error handling, e.g. to catch any error code returned by NEWOPTFUNCTION if it doesn't converge, and give an informative message.
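A hedged sketch of such error handling, again with stats::optim standing in for NEWOPTFUNCTION (the wrapper name and message are made up for illustration):

```r
## Wrap the optimiser call so a failure produces an informative error
## instead of an opaque one. stats::optim is a stand-in here.
run_optimiser <- function(fn, inits, ...) {
    tryCatch(
        stats::optim(inits, fn, ...),
        error = function(e)
            stop("optimiser failed: ", conditionMessage(e), call. = FALSE)
    )
}
res <- run_optimiser(function(p) sum((p - 2)^2), c(0, 0))
res$convergence   # 0 means stats::optim reports successful convergence
```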
Out of interest, which optimiser are you using? It may be worth making your msm.optim.NEWOPTFUNCTION available more widely if it is more reliable. On the other hand, if you are having to fiddle with optimisers to get results, consider whether a simpler model would suit your purpose - beware of trying to squeeze information out of your data that isn't there.
Thank you for your reply! The optimisation routine I'm using is actually in Julia - an adaptive differential evolution algorithm provided by the BlackBoxOptim.jl package - where I'm executing all R code in an R environment via the RCall.jl package. (The reverse would presumably also be possible by accessing this optimisation package through JuliaCall.) For the moment, I'll just write an msm.optim.dummy routine which returns the hard-coded best parameters and objective function value.

For using stats::optimHess for Hessian estimation, what would the call to msm:::lik.msm look like? Can the gradient passed via gr be used directly inside the optimHess call?
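As a toy sketch of the optimHess mechanics (not using msm): any named arguments after gr are passed through to both fn and gr, so a gradient function with the same signature as fn can be supplied directly. The function names and the scale argument below are made up for illustration:

```r
## Toy -2 log-likelihood stand-in and its analytic gradient; both take
## the same extra argument, which optimHess passes through via `...`.
fn <- function(par, scale) scale * sum(par^2)
gr <- function(par, scale) 2 * scale * par
hess <- stats::optimHess(par = c(1, 2), fn = fn, gr = gr, scale = 3)
hess   # approximately diag(c(6, 6))
```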