
MadNLP Integration#63

Open
gennadiryan wants to merge 4 commits into main from feat/madnlp-integration

Conversation

@gennadiryan
Member

Initial setup for MadNLP integration

TODOs

  • General cleanup of DirectTrajOpt.jl test infra.
  • Resolution of the TODO pertaining to a bug wherein eval_constraint_jacobian_transpose_product is called by MadNLPSolver after optimization has concluded. This appears to stem from how the NLP is constructed, since a "direct" method of specifying the NLP avoids the issue, even on larger problems. To reviewers and/or Claude: consider instantiating the optimizer in get_optimizer_and_variables only after all NLP constraints have been added to the underlying NLPBlock.
  • Wider support for the rich set of options exposed by MadNLP.jl via DirectTrajOpt.Solvers.MadNLPOptions/DirectTrajOpt.IpoptSolverExt.MadNLPOptions, along with an AbstractOptimizer type associated to each AbstractSolverOptions type. This is recommended if we want to pass callbacks, options, etc. through a similar API to both solvers.

Further goals

  • (Fair) benchmarking of Ipopt.jl vs. MadNLP.jl
  • Investigation of multithreading/multiprocessing in Ipopt.jl vs. MadNLP.jl
  • Investigation of the performance benefits of MadNLPGPU.jl for DirectTrajOpt.jl-style problems
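The construction-order fix suggested in the second TODO can be sketched in plain Julia. This is only an illustration of the ordering, not the PR's actual API: NLPBlockSketch and build_optimizer below are hypothetical stand-ins for the real NLPBlock and get_optimizer_and_variables. The point is simply that every constraint is registered before the optimizer is instantiated, so the solver never holds a half-populated NLP block.

```julia
# Illustrative stand-in for the NLPBlock the PR describes.
struct NLPBlockSketch
    constraints::Vector{Function}
end

# Stand-in for optimizer construction: only called once the block is complete.
build_optimizer(block::NLPBlockSketch) = (n_constraints = length(block.constraints),)

block = NLPBlockSketch(Function[])
push!(block.constraints, x -> x[1]^2 - 1.0)   # add all constraints first...
push!(block.constraints, x -> sum(x) - 2.0)
opt = build_optimizer(block)                   # ...then build the optimizer
```

Under this ordering the optimizer sees the full constraint set at construction time, which is the behavior the "direct" method of specifying the NLP gets for free.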


@jack-champagne jack-champagne left a comment


Looking good so far! Reusing IpoptEvaluator for the MOI stuff is fine with me; it's all MOI.AbstractNLPEvaluator anyway.

A few notes; see my other comments above, of course. The big-picture thought is that we should move toward ripping this out into a package extension for users. I have left some ideas on how to make this easy in my other comments. Here is a list of things we should move around to make this happen:

Move everything MadNLP-specific out of src/solvers/ipopt_solver/:

  • MadNLPOptions struct
  • _solve_madnlp!
  • get_optimizer_and_variables(::DirectTrajOptProblem, ::MadNLPOptions, ...)
  • set_variables!(::MadNLP.Optimizer, ...)
  • update_trajectory!(::DirectTrajOptProblem, ::MadNLP.Optimizer, ...)
  • set_options!(::MadNLP.Optimizer, ::MadNLPOptions)
  • The AbstractOptimizer union type (the Ipopt methods in constraints.jl can stay typed to Ipopt.Optimizer — the extension will add its own methods or you can widen to MOI.AbstractOptimizer)
  • The eval_constraint_jacobian_transpose_product stub
  • The MadNLP test item(s)

And a list of things that should stay in the ipopt stuff:

  • IpoptOptions, IpoptEvaluator, all Ipopt solver code
  • AbstractSolverOptions in Solvers module is good
  • The solve! method just needs its options type annotation to remain AbstractSolverOptions — dispatch will route to the right _solve! method based on the concrete type.
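The split above can be mocked up without either solver installed. In the sketch below (struct fields and return values are placeholders, not the PR's real signatures), solve! keeps the wide AbstractSolverOptions annotation while the concrete option types route to the right backend purely by dispatch:

```julia
abstract type AbstractSolverOptions end

struct IpoptOptions <: AbstractSolverOptions
    max_iter::Int
end

struct MadNLPOptions <: AbstractSolverOptions
    max_iter::Int
end

# Stand-ins for _solve_ipopt! / _solve_madnlp!; each returns a tag so the
# routing is observable.
_solve!(prob, options::IpoptOptions; kwargs...) = :ipopt
_solve!(prob, options::MadNLPOptions; kwargs...) = :madnlp

# Public entry point: the annotation stays wide; dispatch on the concrete
# options type picks the backend.
solve!(prob; options::AbstractSolverOptions = IpoptOptions(100), kwargs...) =
    _solve!(prob, options; kwargs...)
```

With this layout the Ipopt methods live in the package, the extension adds the MadNLPOptions methods when MadNLP loads, and solve! itself never changes.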

Comment on lines +87 to +105
function _solve!(
    prob::DirectTrajOptProblem,
    options::Any;
    verbose::Bool = true,
    callback = nothing,
    kwargs...,
)
    if options isa Solvers.AbstractSolverOptions
        if options isa Solvers.IpoptOptions
            _solve_ipopt!(prob, options; verbose = verbose, callback = callback, kwargs...)
        elseif options isa Solvers.MadNLPOptions
            _solve_madnlp!(prob, options; verbose = verbose, callback = callback, kwargs...)
        else
            @warn "Solver options not recognized"
        end
    else
        @warn "Solver options invalid"
    end
end

The _solve! dispatcher (solver.jl:89-103) checks options isa Solvers.IpoptOptions and options isa Solvers.MadNLPOptions, but both types are defined in IpoptSolverExt, not in the Solvers module (which only exports AbstractSolverOptions and solve!).

The quickest fix is to drop the hand-rolled isa branching entirely and use Julia's dispatch:

_solve!(prob, options::IpoptOptions; kwargs...) = _solve_ipopt!(prob, options; kwargs...)
_solve!(prob, options::MadNLPOptions; kwargs...) = _solve_madnlp!(prob, options; kwargs...)

This leans on Julia's type system anyway.
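If a guard against unrecognized options is still wanted after dropping the isa branches, a catch-all method gives a hard error instead of a silent @warn. A self-contained sketch (the types and return value here are placeholders):

```julia
abstract type AbstractSolverOptions end
struct IpoptOptions <: AbstractSolverOptions end

# Concrete backend method, as in the dispatch-based fix.
_solve!(prob, options::IpoptOptions; kwargs...) = :ipopt

# Catch-all fallback: replaces the hand-rolled isa/@warn branching with an
# explicit error naming the offending type.
_solve!(prob, options; kwargs...) =
    throw(ArgumentError("unsupported solver options of type $(typeof(options))"))

result = _solve!(nothing, IpoptOptions())
```

Failing loudly is arguably better here than the original @warn branches, which let solve! return without solving anything.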

solve!(prob; max_iter = 100)
end

@testitem "testing MadNLP.jl solver" begin

This test calls _solve_madnlp! directly instead of going through solve!(prob; options = MadNLPOptions(), ...). We should either test the public API here or add another test with that exact goal.


We should split the MadNLP code off from the ipopt_solver files. We should also figure out how to take the line https://github.com/harmoniqs/DirectTrajOpt.jl/pull/63/changes#diff-f19c825aceabbf5309ddaed77474ba8474413cfc2b7b3d0d950f658d7fda7dafR20 and put it in a different part of the codebase; perhaps the right pieces end up in a dedicated package extension for MadNLP.


If we want to make this a package extension, we can start here and do something like:

[deps]
# remove MadNLP from here

[weakdeps]
MadNLP = "..."

[extensions]
MadNLPExt = "MadNLP"

and in file structure:

ext/
  MadNLPExt.jl    # or ext/MadNLPExt/MadNLPExt.jl if multiple files

The extension module file itself will look something like:

module MadNLPExt

using DirectTrajOpt
import DirectTrajOpt: DirectTrajOptProblem, AbstractSolverOptions, ...
import MadNLP
using MathOptInterface
import MathOptInterface as MOI

# MadNLPOptions, _solve_madnlp!, get_optimizer_and_variables, etc.

end
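The load-time wiring this relies on can be imitated with plain nested modules; ParentPkg and FakeExt below are illustrative stand-ins for DirectTrajOpt and MadNLPExt, not real package code. The parent declares a function stub with zero methods, and the "extension" adds the method when it loads, which is exactly how the real extension would attach _solve_madnlp!:

```julia
module ParentPkg
    # Stub owned by the package: no methods until an extension defines one.
    function _solve_madnlp! end
    madnlp_available() = !isempty(methods(_solve_madnlp!))
end

module FakeExt
    import ..ParentPkg
    # The real MadNLPExt would wrap MadNLP here; this stand-in just tags
    # the call so the wiring is observable.
    ParentPkg._solve_madnlp!(prob, options) = :solved_by_madnlp
end

ParentPkg.madnlp_available()   # true once the "extension" has loaded
```

In the real setup Julia loads MadNLPExt automatically when both DirectTrajOpt and MadNLP are in the environment, so users never import the extension module themselves.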
