
NLPModelsJuMP.jl's Introduction

NLPModelsJuMP

NLPModelsJuMP.jl provides conversion from JuMP.jl / MathOptInterface.jl models to NLPModels.

How to Cite

If you use NLPModelsJuMP.jl in your work, please cite using the format given in CITATION.bib.


See the documentation on NLPModels for the description of its API. Here, we focus on the use of JuMP to create MathOptNLPModel and MathOptNLSModel.
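As a minimal sketch (assuming JuMP, NLPModels, and NLPModelsJuMP are installed), a JuMP model can be wrapped as an NLPModel like this:

```julia
using JuMP, NLPModels, NLPModelsJuMP

model = Model()
@variable(model, x[i = 1:2], start = [-1.2, 1.0][i])
@NLobjective(model, Min, (x[2] - x[1]^2)^2 + (x[1] - 1)^2)

nlp = MathOptNLPModel(model)  # an NLPModel backed by the JuMP evaluator
fx = obj(nlp, nlp.meta.x0)    # evaluate the objective at the starting point
```

The resulting `nlp` supports the usual NLPModels API (`obj`, `grad!`, `hess_coord!`, ...).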

Disclaimer: NLPModelsJuMP is not developed or maintained by the JuMP developers.

Bug reports and discussions

If you think you found a bug, feel free to open an issue. Focused suggestions and requests can also be opened as issues. Please start an issue or a discussion on the topic before opening a pull request.

If you want to ask a question not suited for a bug report, feel free to start a discussion here. This forum is for general discussion about this repository and the JuliaSmoothOptimizers organization, so questions about any of our packages are welcome.

NLPModelsJuMP.jl's People

Contributors

abelsiqueira, amontoison, blegat, dpo, github-actions[bot], juliatagbot, monssaftoukal, odow, tmigot


NLPModelsJuMP.jl's Issues

MOI Optimizer

I am wondering whether it would be possible to make something like this work:

model = Model(NLPModelsJuMP.Optimizer)
set_attribute(model, "solver", percival)
# ...
optimize!(model)

For this, we would need to parse the whole model at the MOI level, but looking at the code, it doesn't seem too hard to change.

Let me know what you think.

Track allocations in a MathOptNLP(NLS)Model

I ran the following script tracking allocations:

using Pkg
Pkg.activate(".")
using JuMP, LinearAlgebra, NLPModels, NLPModelsJuMP, NLPModelsTest # main version of NLPModelsTest

nlp_problems = setdiff(NLPModelsTest.nlp_problems, ["MGH01Feas"])
nls_problems = NLPModelsTest.nls_problems
extra_nls_problems = ["HS30", "HS43", "MGH07"]

for problem in lowercase.(nlp_problems)
  include(joinpath("nlp_problems", "$problem.jl"))
end

for problem in lowercase.(nls_problems ∪ extra_nls_problems)
  include(joinpath("nls_problems", "$problem.jl"))
end

for prob in Symbol.(lowercase.(nlp_problems))
  prob_fn = eval(prob)
  nlp = MathOptNLPModel(prob_fn(), hessian = (prob != :nohesspb), name = string(prob))
  print_nlp_allocations(nlp, test_allocs_nlpmodels(nlp))
end

for prob in Symbol.(lowercase.(nls_problems ∪ extra_nls_problems))
  prob_fn = eval(prob)
  nls = prob_fn()
  print_nlp_allocations(nls, test_allocs_nlpmodels(nls))
  print_nlp_allocations(nls, test_allocs_nlsmodels(nls))
end

and got this report:

  Problem name: brownden
                        obj: ████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 16.0  
                hess_coord!: ████████████████████ 96.0  
                      grad!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0   
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0   
                     hprod!: █████████████████⋅⋅⋅ 80.0  
              hess_op_prod!: █████████████████⋅⋅⋅ 80.0  

  Problem name: hs5
                        obj: ████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 16.0
                hess_coord!: ████████████████████ 96.0
                      grad!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: █████████████████⋅⋅⋅ 80.0
              hess_op_prod!: █████████████████⋅⋅⋅ 80.0

  Problem name: hs6
                 hprod_lag!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                    jtprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                     jprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                        obj: ███⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 16.0
     jac_op_transpose_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      cons!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                hess_coord!: ███████████████⋅⋅⋅⋅⋅ 80.0
               jac_op_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      grad!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_lag_coord!: ████████████████████ 112.0
          hess_lag_op_prod!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                 jac_coord!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ███████████████⋅⋅⋅⋅⋅ 80.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ███████████████⋅⋅⋅⋅⋅ 80.0

  Problem name: hs10
                 hprod_lag!: ████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                    jtprod!: ██████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                     jprod!: ██████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                        obj: ████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 32.0
     jac_op_transpose_prod!: ██████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      cons!: ██████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                hess_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
               jac_op_prod!: ██████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      grad!: ████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
            hess_lag_coord!: ████████████████████ 176.0
          hess_lag_op_prod!: ████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                 jac_coord!: ██████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: hs11
                 hprod_lag!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                    jtprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                     jprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                        obj: ██████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 32.0
     jac_op_transpose_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      cons!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                hess_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
               jac_op_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      grad!: ██████████████████⋅⋅ 96.0
            hess_lag_coord!: ████████████████████ 112.0
          hess_lag_op_prod!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                 jac_coord!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: hs13
                 hprod_lag!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                    jtprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                     jprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                        obj: ███⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 16.0
     jac_op_transpose_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      cons!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                hess_coord!: ███████████████⋅⋅⋅⋅⋅ 80.0
               jac_op_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      grad!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_lag_coord!: ████████████████████ 112.0
          hess_lag_op_prod!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                 jac_coord!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ███████████████⋅⋅⋅⋅⋅ 80.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ███████████████⋅⋅⋅⋅⋅ 80.0

  Problem name: hs14
                 hprod_lag!: ██████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                    jtprod!: ████████████████████ 128.0
                     jprod!: ████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                        obj: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 32.0
     jac_op_transpose_prod!: ████████████████████ 128.0
                      cons!: ████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                hess_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
               jac_op_prod!: ████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      grad!: ███████████████⋅⋅⋅⋅⋅ 96.0
            hess_lag_coord!: ██████████████████⋅⋅ 112.0
          hess_lag_op_prod!: ██████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                 jac_coord!: ████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: lincon
                 hprod_lag!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                    jtprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     jprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                        obj: ███⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 16.0
     jac_op_transpose_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                      cons!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                hess_coord!: ███████████████⋅⋅⋅⋅⋅ 80.0
               jac_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                      grad!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_lag_coord!: ████████████████████ 112.0
          hess_lag_op_prod!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                 jac_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ███████████████⋅⋅⋅⋅⋅ 80.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ███████████████⋅⋅⋅⋅⋅ 80.0

  Problem name: linsv
                 hprod_lag!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                    jtprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     jprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                        obj: ██████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 32.0
     jac_op_transpose_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                      cons!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                hess_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
               jac_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                      grad!: ████████████████████ 64.0
            hess_lag_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
          hess_lag_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                 jac_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: lls
                 hprod_lag!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                    jtprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     jprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                        obj: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
     jac_op_transpose_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                      cons!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                hess_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
               jac_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                      grad!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_lag_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
          hess_lag_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                 jac_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: lls
     hess_op_residual_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
       hess_coord_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
    jac_structure_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            jprod_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
        jac_coord_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
      jac_op_residual_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hprod_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
jac_op_residual_transpose_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
           jtprod_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                  residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
   hess_structure_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: mgh01
                        obj: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                hess_coord!: ████████████████████ 192.0
                      grad!: ██████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 96.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 80.0
              hess_op_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 80.0

  Problem name: mgh01
     hess_op_residual_prod!: ███████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
       hess_coord_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
    jac_structure_residual!: ████████████████████ 208.0
            jprod_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
        jac_coord_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
      jac_op_residual_prod!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hprod_residual!: ███████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
jac_op_residual_transpose_prod!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
           jtprod_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                  residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
   hess_structure_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: nlshs20
                 hprod_lag!: ███████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                    jtprod!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                     jprod!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                        obj: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
     jac_op_transpose_prod!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      cons!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                hess_coord!: ████████████████████ 208.0
               jac_op_prod!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      grad!: ██████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 96.0
            hess_lag_coord!: █████████████████⋅⋅⋅ 176.0
          hess_lag_op_prod!: ███████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                 jac_coord!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ██████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 96.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ██████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 96.0

  Problem name: nlshs20
     hess_op_residual_prod!: ███████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
       hess_coord_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
    jac_structure_residual!: ████████████████████ 208.0
            jprod_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
        jac_coord_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
      jac_op_residual_prod!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hprod_residual!: ███████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
jac_op_residual_transpose_prod!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
           jtprod_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                  residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
   hess_structure_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: nlslc
                 hprod_lag!: ██████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                    jtprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     jprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                        obj: ████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
     jac_op_transpose_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                      cons!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                hess_coord!: ████████████████████ 128.0
               jac_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                      grad!: ███████████████⋅⋅⋅⋅⋅ 96.0
            hess_lag_coord!: ██████████████████⋅⋅ 112.0
          hess_lag_op_prod!: ██████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                 jac_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: █████████████⋅⋅⋅⋅⋅⋅⋅ 80.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: █████████████⋅⋅⋅⋅⋅⋅⋅ 80.0

  Problem name: nlslc
     hess_op_residual_prod!: ████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 176.0
       hess_coord_residual!: ██⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
    jac_structure_residual!: ████████████████████ 880.0
            jprod_residual!: ██⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
        jac_coord_residual!: ██⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
      jac_op_residual_prod!: ██⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hprod_residual!: ████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 176.0
jac_op_residual_transpose_prod!: ██⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
           jtprod_residual!: ██⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                  residual!: ██⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
   hess_structure_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: hs30
                 hprod_lag!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                    jtprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                     jprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                        obj: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
     jac_op_transpose_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      cons!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                hess_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
               jac_op_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      grad!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_lag_coord!: ████████████████████ 112.0
          hess_lag_op_prod!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                 jac_coord!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: hs30
     hess_op_residual_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
       hess_coord_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
    jac_structure_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            jprod_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
        jac_coord_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
      jac_op_residual_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hprod_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
jac_op_residual_transpose_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
           jtprod_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                  residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
   hess_structure_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: hs43
                 hprod_lag!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                    jtprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                     jprod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                        obj: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
     jac_op_transpose_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      cons!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                hess_coord!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
               jac_op_prod!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                      grad!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hess_lag_coord!: ████████████████████ 112.0
          hess_lag_op_prod!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 64.0
                 jac_coord!: █████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
             jac_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
              hess_op_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: hs43
     hess_op_residual_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
       hess_coord_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
    jac_structure_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            jprod_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
        jac_coord_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
      jac_op_residual_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
            hprod_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
jac_op_residual_transpose_prod!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
           jtprod_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                  residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
   hess_structure_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

  Problem name: mgh07
                        obj: ████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                hess_coord!: ████████████████████ 256.0
                      grad!: ████████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 96.0
            hess_structure!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0
                     hprod!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 144.0
              hess_op_prod!: ████████████⋅⋅⋅⋅⋅⋅⋅⋅ 144.0

  Problem name: mgh07
     hess_op_residual_prod!: ██████████████⋅⋅⋅⋅⋅⋅ 144.0
       hess_coord_residual!: ██████████████⋅⋅⋅⋅⋅⋅ 144.0
    jac_structure_residual!: ████████████████████ 208.0
            jprod_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
        jac_coord_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
      jac_op_residual_prod!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
            hprod_residual!: ██████████████⋅⋅⋅⋅⋅⋅ 144.0
jac_op_residual_transpose_prod!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
           jtprod_residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
                  residual!: █████⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 48.0
   hess_structure_residual!: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0.0

Bug in hessian structure

Oh, I think the problem actually comes from the model. I ran the following (somewhat alarming) test:

julia> using NLSProblems, NLPModels

julia> nlp = hs06()

julia> hess_structure(nlp)
([1, 267043088], [1, 167116808])

julia> hess_structure(nlp)
([1, 331019536], [1, 269359968])

julia> hess_structure(nlp)
([1, 331019536], [1, 167149984])

julia> hess_structure(nlp)
([1, 331019536], [1, 167150048])

julia> hess_structure(nlp)
([1, 5124065472], [1, 5124065472])

Originally posted by @tmigot in JuliaSmoothOptimizers/CaNNOLeS.jl#43 (comment)

Hessian should be zero/Jacobian coord?

Long post, sorry. I'm having a very specific issue related to JuliaSmoothOptimizers/NLPModels.jl#146 these lines

function MathProgBase.hesslag_structure(d::NLPModelEvaluator)
  rows, cols, _ = hess_coord(d.nlp, [0.317i for i = 1:d.nlp.meta.nvar],
                             y=[0.618i for i = 1:d.nlp.meta.ncon])
  return rows, cols
end

function MathProgBase.eval_hesslag(d::NLPModelEvaluator, H, x, σ, μ)
  rows, cols, vals = hess_coord(d.nlp, x, y=μ, obj_weight=σ)
  copyto!(H, vals)
end

Essentially, my model computes hess_coord with a numerical error that sometimes vanishes. When the model is created, the structure is stored with 4 nonzeros, but during execution hess_coord returns 5 nonzeros. The problem is that the structure has 4 elements while hess_coord now wants to store 5.

Now with more details. I have the NLSModel mgh04 and the NLS-to-constrained-LS conversion. The Hessian of F3(x) = x[1] * x[2] - 2 should be [0 1; 1 0], but any of these zeros can turn into an error of order eps().
This can be recreated without NLS:

using JuMP, NLPModels, NLPModelsJuMP
model = Model()
@variable(model, x[1:2])
@NLobjective(model, Min, 0.0)
@NLconstraint(model, x[1] * x[2] == 2)
nlp = MathProgNLPModel(model)
hess_coord(nlp, [0.659; 0.702], y=[1.0])

the output is ([1, 2, 2], [1, 2, 1], [0.0, 1.11022e-16, 1.0]), which means one of the elements that should be 0 is actually close to 1e-16. This doesn't produce an error here, because the structure is such that these "null" elements are stored.

However, when I convert a MathProgNLSModel using NLStoCons, it loses that information and returns only the nonzero elements. This becomes a problem when I pass it to NLPtoMPB as explained above.

Now, a possible workaround is to change the code of eval_hesslag in nlp_to_mpb.jl to return only the vals for which rows and cols are defined in the structure.
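That workaround could look something like the following sketch (the function name and arguments are illustrative, not the actual code in nlp_to_mpb.jl):

```julia
# Keep only the values whose (row, col) pair appears in the stored structure.
# `srows`/`scols` are the structure recorded at model creation; spurious
# extra entries produced later by hess_coord are dropped.
function filter_to_structure(rows, cols, vals, srows, scols)
  keep = Set(zip(srows, scols))
  return [v for (r, c, v) in zip(rows, cols, vals) if (r, c) in keep]
end
```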

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

Define MathOptNLSModel when Hessian is not available

The following problem

using JuMP                  
using NLPModelsJuMP         
using JSOSolvers             
model = Model()
g(x...) = 10*length(x) + sum(x[i].^2 .- 10*cos.(2π*x[i]) for i in 1:length(x))
register(model, :g, 5, g, autodiff = true)
x₀ = [1.0, 0.1, 0.2, -0.5, 1.0]
@variable(model, x[i=1:5], start = x₀[i])
@NLexpression(model, res[i in 1:5], g(x...))
nls = MathOptNLSModel(model, res, name = "NL")

returns an error LoadError: Unsupported feature Hess. Providing a Hessian through register does not work as JuMP does not support Hessians for multivariate functions. I've seen in #68 that this issue was solved for MathOptNLPModel.

Has this been considered for MathOptNLSModel?

Upcoming refactoring of JuMP's nonlinear API

The upcoming release of JuMP v1.2 will break Complementarity. Read more here: https://discourse.julialang.org/t/ann-upcoming-refactoring-of-jumps-nonlinear-api/83052

The breakage looks pretty minor, and mostly affects hacks that were added to get constraint bounds:

nl_lcon = nnln == 0 ? Float64[] : map(nl_con -> nl_con.lb, jmodel.nlp_data.nlconstr)
nl_ucon = nnln == 0 ? Float64[] : map(nl_con -> nl_con.ub, jmodel.nlp_data.nlconstr)
eval = jmodel.nlp_data == nothing ? nothing : NLPEvaluator(jmodel)

x-ref: jump-dev/JuMP.jl#2955

Please ping me if you have questions. I'll try to get time before JuMP 1.2 is released to update this to only use the public API.

You probably need something like

block = MOI.get(model, MOI.NLPBlock())
map(b -> b.lb, block.constraint_bounds)

Add support for quadratic constraints

It could be useful to treat quadratic constraints separately from other nonlinear constraints.
An open question is how to store all the Hessians of the quadratic constraints:

  • List of 2D COO
  • 3D COO for third order tensor (rows, cols, tubes, vals)
  • ...
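The third-order COO option could be sketched as follows (a hypothetical layout, not an existing API):

```julia
# A third-order COO tensor: entry k means H[tubes[k]][rows[k], cols[k]] = vals[k],
# where tubes[k] indexes which quadratic constraint the entry belongs to.
struct COO3{T}
  rows::Vector{Int}
  cols::Vector{Int}
  tubes::Vector{Int}
  vals::Vector{T}
end
```

This keeps all constraint Hessians in four flat vectors instead of a list of 2D COO matrices.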

Repo name

I think it would make more sense for this repository to be named JuMPNLPModels, or simply JuMPModels. Opinions?

Can't use views?

julia> using OptimizationProblems
julia> using NLPModelsJuMP
julia> model = MathProgNLPModel(hs6());
julia> xx = rand(5);
julia> x = @view xx[2:3];
julia> obj(model, x)
ERROR: MethodError: no method matching forward_eval(::Array{Float64,1}, ::Array{Float64,1}, ::Array{ReverseDiffSparse.NodeData,1}, ::SparseMatrixCSC{Bool,Int64}, ::Array{Float64,1}, ::Array{Float64,1}, ::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}; user_operators=ReverseDiffSparse.UserOperatorRegistry(Dict{Symbol,Int64}(), MathProgBase.SolverInterface.AbstractNLPEvaluator[], Dict{Symbol,Int64}(), Any[], Any[], Any[]))
Closest candidates are:
  forward_eval(::Array{T,1}, ::AbstractArray{T,1}, ::AbstractArray{ReverseDiffSparse.NodeData,1}, ::Any, ::Any, ::Any, ::Array{T,1}, ::Any, ::Any, ::Any; user_operators) where T at /Users/dpo/.julia/v0.6/ReverseDiffSparse/src/forward.jl:20
  forward_eval(::Array{T,1}, ::AbstractArray{T,1}, ::AbstractArray{ReverseDiffSparse.NodeData,1}, ::Any, ::Any, ::Any, ::Array{T,1}, ::Any, ::Any) where T at /Users/dpo/.julia/v0.6/ReverseDiffSparse/src/forward.jl:20 got unsupported keyword argument "user_operators"
  forward_eval(::Array{T,1}, ::AbstractArray{T,1}, ::AbstractArray{ReverseDiffSparse.NodeData,1}, ::Any, ::Any, ::Any, ::Array{T,1}, ::Any) where T at /Users/dpo/.julia/v0.6/ReverseDiffSparse/src/forward.jl:20 got unsupported keyword argument "user_operators"
Stacktrace:
 [1] forward_eval_all(::JuMP.NLPEvaluator, ::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}) at /Users/dpo/.julia/v0.6/JuMP/src/nlp.jl:445
 [2] eval_f(::JuMP.NLPEvaluator, ::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}) at /Users/dpo/.julia/v0.6/JuMP/src/nlp.jl:477
 [3] obj(::NLPModelsJuMP.MathProgNLPModel, ::SubArray{Float64,1,Array{Float64,1},Tuple{UnitRange{Int64}},true}) at /Users/dpo/.julia/v0.6/NLPModelsJuMP/src/mpb_model.jl:131

MOI optimizer broke

It seems the constructor NLPModelsJuMP.Optimizer() broke with an update of one of the dependencies. When running the tests, I get

Got exception outside of a @test
  MethodError: no method matching SolverCore.GenericExecutionStats{Float64, Vector{Float64}, Vector{Float64}, Any}()
  
  Closest candidates are:
    (::Type{SolverCore.GenericExecutionStats{T, S, V, Tsp}} where {T, S, V, Tsp})(::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any, ::Any)
     @ SolverCore ~/.julia/packages/SolverCore/iOwNg/src/stats.jl:109
  
  Stacktrace:
    [1] NLPModelsJuMP.Optimizer()
      @ NLPModelsJuMP ~/.julia/dev/NLPModelsJuMP/src/MOI_wrapper.jl:10

The versions used are

(NLPModelsJuMP) pkg> st
Project NLPModelsJuMP v0.12.1
Status `~/.julia/dev/NLPModelsJuMP/Project.toml`
  [4076af6c] JuMP v1.15.1
  [b8f27783] MathOptInterface v1.20.1
  [a4795742] NLPModels v0.20.0
  [01435c0c] Percival v0.7.0
  [ff4d7338] SolverCore v0.3.7
  [37e2e46d] LinearAlgebra
  [de0858da] Printf
  [2f01184e] SparseArrays

Function tracing (JuMP 1.15.0) is not supported

A short, not minimal, example:

n = 2
jmp = Model()
σ(t) = 1 / (1 + exp(-t))
@variable(jmp, x[1:n])
@variable(jmp, y[1:n])
@objective(jmp, Min,
	sum(σ(x[i] - 1)^2 for i = 1:n) + sum(y[i]^2 for i = 1:n)
)
@constraint(jmp, [i=1:n,j=i+1:n], x[i] + x[j] == y[i] - y[j])

nlp = MathOptNLPModel(jmp)

Raises

MethodError: Cannot `convert` an object of type
  MathOptInterface.ScalarNonlinearFunction to an object of type
  Union{MathOptInterface.VariableIndex, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.ScalarQuadraticFunction{Float64}}

Closest candidates are:
  convert(::Type{T}, !Matched::T) where T
   @ Base Base.jl:64

get(::MathOptInterface.Utilities.ObjectiveContainer{Float64}, ::MathOptInterface.ObjectiveFunction{Union{MathOptInterface.VariableIndex, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.ScalarQuadraticFunction{Float64}}})@objective_container.jl:130
[email protected]:294[inlined]
get(::MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}, ::MathOptInterface.ObjectiveFunction{Union{MathOptInterface.VariableIndex, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.ScalarQuadraticFunction{Float64}}})@universalfallback.jl:550
[email protected]:869[inlined]
parser_objective_MOI(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}, ::Int64)@utils.jl:328
var"#MathOptNLPModel#15"(::Bool, ::String, ::Type{NLPModelsJuMP.MathOptNLPModel}, ::JuMP.Model)@moi_nlp_model.jl:31
NLPModelsJuMP.MathOptNLPModel(::JuMP.Model)@moi_nlp_model.jl:19
top-level scope @ Local: 12

Release notes of JuMP 1.15.0: https://jump.dev/blog/1.15.0-release/

Quadratic functions in the constraints are not picked up by MathOptNLPModel

More specifically,

model = Model()
@variable(model, x[1:2])
@objective(model, Min, sum(x .^ 4))
@constraint(model, sum(x .^ 2) == 1)
nlp = MathOptNLPModel(model)

warns with

┌ Warning: Function MathOptInterface.ScalarQuadraticFunction{Float64} is not supported.
└ @ NLPModelsJuMP ~/.julia/packages/NLPModelsJuMP/1KSlI/src/utils.jl:219

And indeed, there are no constraints in the model. Solving the problem via JuMP using NLPModels.Optimizer works.

cc. @blegat, if you can help us again.

Linear / quadratic models

If a JuMP model m has linear constraints and a linear or quadratic objective, we cannot use NLPEvaluator(m).constraints because it's an undefined reference. We can detect a nonlinear objective with eval.has_nlobj, but I don't know if we have something similar for nonlinear constraints...

Edit: num_nl_constraints(m) works.
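Putting the two checks together, detection could be sketched as follows (assuming the pre-1.2 JuMP nonlinear API discussed in this thread; `nonlinear_parts` is a hypothetical helper name):

```julia
using JuMP

# Detect the nonlinear parts of a JuMP model `m` using the fields
# mentioned above: `has_nlobj` on the evaluator, and `num_nl_constraints`.
function nonlinear_parts(m::Model)
  ev = NLPEvaluator(m)
  return (has_nl_objective = ev.has_nlobj,
          has_nl_constraints = num_nl_constraints(m) > 0)
end
```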

Combine this package with SparseMatricesCOO

Define MathOptNLPModel when hessian is not available

Trying to convert the following problem in an NLPModel:

using JuMP, NLPModelsJuMP

model = Model()
@variable(model, x[1:2])
g(x::T, y::T) where {T<:Real} = x * y
function ∇g(g::Vector{T}, x::T, y::T) where {T<:Real}
  g[1] = y
  g[2] = x
  return
end
register(model, :g, 2, g, ∇g)
@NLobjective(model, Min, g(x[1], x[2]))
nlp = MathOptNLPModel(model)

will return Unsupported feature Hess. From what I understand in MOI, when using register in the model, the hessian is not available. Hence, :Hess and :HessVec are not available for such problems in MOI.initialize.

Any suggestion to handle this case?
Maybe add a try/catch and then assume that the user won't use Hessian-related functions.
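That try/catch idea could be sketched like this (hypothetical; `evaluator` is an MOI.AbstractNLPEvaluator, and the fallback simply assumes Hessian calls will not be made):

```julia
using MathOptInterface
const MOI = MathOptInterface

# Request second-order features, and fall back to first-order only if the
# evaluator does not provide a Hessian (e.g. a multivariate registered function).
function safe_initialize!(evaluator)
  try
    MOI.initialize(evaluator, [:Grad, :Jac, :Hess, :HessVec])
    return true   # Hessian available
  catch
    MOI.initialize(evaluator, [:Grad, :Jac])
    return false  # assume the user won't call Hessian-related functions
  end
end
```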
