RFC: import NLP interface from MathProgBase with minor modifications #202
Conversation
I am curious about how NLP + complementarity will be supported. Is it going to be some reformulations, I guess?

@kaarthiksundar, @jac0320, @Wikunia, all of our JuMP solvers will need to be updated to this new NLP API to support JuMP v0.19. Now is the time to review this slightly revised NLP API and share any insights/suggestions you might have. It's also a good time to bring up minor feature requests that would be helpful to our solvers. CC @rb004f, @harshangrjn, @ad2476, @hhijazi
One thing we noticed recently in MPB is that KNITRO has a
@chkwon, this API provides a way to specify an NLP with complementarity constraints in JuMP and send it directly to a solver. It's up to the solver to decide if/how to support them. The syntax would look something like:

```julia
... # build nonlinear model
@constraint(m, [x, y] in Complements())
```

where you'd define ... @ccoffrin, the equivalent of ...
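For illustration, a minimal sketch of what such a `Complements` set might look like on the MOI side. This type does not exist in this PR; the name and field are assumptions made here only to make the proposed syntax concrete:

```julia
using MathOptInterface
const MOI = MathOptInterface

# Hypothetical vector set encoding elementwise complementarity x ⟂ y.
# Not part of this PR; name and field are illustrative assumptions.
struct Complements <: MOI.AbstractVectorSet
    dimension::Int  # total length of the stacked vector [x; y]
end

MOI.dimension(s::Complements) = s.dimension
```

A solver that recognizes such a set could either reformulate it (e.g. into products constrained to zero) or pass it natively, which is exactly the "up to the solver" point above.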
Why don't we do something like

```julia
abstract type AbstractScalarNLPEvaluator <: AbstractNLPEvaluator end
abstract type AbstractVectorNLPEvaluator <: AbstractNLPEvaluator end

struct ScalarNonLinearFunction
    evaluator::AbstractScalarNLPEvaluator
end

struct VectorNonLinearFunction
    evaluator::AbstractVectorNLPEvaluator
end
```

The former has a gradient and can be used for the objective, while the latter has a Hessian and can be used for the constraints.

```julia
eval(::AbstractNLPEvaluator)
eval_grad(::AbstractScalarNLPEvaluator)
eval_jac(::AbstractVectorNLPEvaluator)
expr(::AbstractScalarNLPEvaluator)
expr(::AbstractVectorNLPEvaluator, i)
```

The constraints can then be added with ...
@blegat, that's heading in the opposite direction from where I'd like the nonlinear interface to go. An ideal nonlinear interface would pass only expressions to the solver, and solvers would call out to AD libraries to compute derivatives if they want to. The only time evaluator callbacks are needed is for user-defined functions. Given that nobody's taken the lead on this, we're porting over the MPB interface with no big structural changes.
To give a bit more reasoning for why I'd like to do that: removing the AD from the JuMP side and having expressions as first-class MOI objects would make it easier to write solvers that manipulate expressions. Currently there's no super fast path to do this since it involves converting from JuMP's internal expression representation to

I agree with @mlubin (and it's why I wanted to delay porting over the old NLP interface to MOI). I plan to work on solvers that use and manipulate the expressions and don't care about the derivatives. It makes sense for solvers to call AD if they rely on it, and not call it if they don't.
I can confirm that the LANL solvers do a lot of expression analysis and manipulation at the MPB level.
I agree with you. The proposal in my comment, just like this PR, is meant to be transitional and to be replaced by expressions (e.g. by replacing the
@blegat, your proposal requires a substantial restructuring of JuMP's AD
Why would it require a different structure than yours? JuMP could still put all NLP constraints in a single vector function, so both proposals require the same structure in JuMP. Instead of having a single evaluator and dispatching between objective and constraints with the suffix of the functions, there are two evaluators. One advantage of having NLP functions as MOI `AbstractFunction`s is that it requires no change to MOIU.
What are the changes needed in MOIU to support this? I see the variable mapping as one point that may need to be wrapped, but that needs to happen regardless of whether NLP data is passed as an attribute or as constraints and objectives.

You are right, I forgot about variable mapping. Since it is an attribute, there might indeed be no change needed.
Definitely constraint+objective

I did another round of renaming towards more explicit names. I'll merge soon barring additional comments.
src/nlp.jl (outdated)

```julia
All subsequent references to the vector ``x`` follow this index mapping; the
`i`th index of ``x`` corresponds to `variable_order[i]`.
"""
function initialize end
```
Pedantic proposal: should this perhaps be `initialize!`?
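For context, a hedged sketch of the index-mapping contract that docstring describes: after `initialize` receives the variable ordering, all later calls refer to variables by position `1..n` in `x`. The concrete evaluator type, its field, and `eval_objective` are illustrative assumptions, not part of the PR:

```julia
# Toy evaluator illustrating the `variable_order` contract. Only the names
# `initialize` and `variable_order` come from the PR; the rest is assumed.
mutable struct ToyEvaluator
    variable_order::Vector{Int}
end

function initialize(d::ToyEvaluator, variable_order::Vector{Int})
    # Record the mapping: x[i] corresponds to variable_order[i].
    d.variable_order = copy(variable_order)
    return nothing
end

# Subsequent evaluations use positions 1..n of `x`, not raw variable indices.
eval_objective(d::ToyEvaluator, x::Vector{Float64}) = x[1]^2 + x[2]
```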
Given that there's been no movement on designing a nice MOI interface for NLP, I think the practical step forward is to import the interface from MPB basically as-is so that we can get NLP solvers working under JuMP/MOI.
In the PR I add an `NLPBlock` attribute that's used to store a block of NLP constraints and optionally an objective. If a user writes down `@NLobjective` in JuMP, the objective will show up in the `NLPBlock`. Otherwise, it will be set with `ObjectiveFunction` as in normal MOI. The most significant change on the solver interface side is that, unless the solver wants to force users to use `@NLobjective` and `@NLconstraint` entirely, it will need to support linear+quadratic objectives and constraints added through the normal MOI interface. This will be a bit of a pain, but it also allows solvers to support NLP + conic or NLP + complementarity just by supporting the corresponding MOI constraints and an `NLPBlock`.

I also added the `NLPBlockDualStart` attribute that finally lets you give a dual starting point.

@odow @ccoffrin @chriscoey @dpo @chkwon @sylvainmouret @abelsiqueira @adowling2 @timholy (If you're not sure why you're CC'd: the next release of JuMP will not support MPB. This API will be the new way JuMP talks to NLP solvers.)

TODO before merging: ~~The interface still needs to be tweaked to account for `VectorIndex`. I'm thinking of a call to `AbstractNLPEvaluator` to set the evaluation point, then we can remove the `x` argument from the evaluation functions. `obj_expr` and `constr_expr` would use the variable indices in `:(x[i])` as well.~~ The API is now set up so that you provide an ordering of the variable indices in `initialize`. All other interactions with the `AbstractNLPEvaluator` are based on indices 1, ..., n. (Maybe `obj_expr` and `constr_expr` should be an exception?)

There's also a bit of room for bikeshedding the names (e.g., `eval_f` to `eval_objective`?).
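As a rough illustration of the solver side, here is a hedged sketch of a wrapper pulling the NLP block and dual start out of a model. The attribute names `NLPBlock` and `NLPBlockDualStart` come from this PR, but the surrounding function and the field names assumed on the block data (`evaluator`, `has_objective`) are illustrative guesses, not a definitive implementation:

```julia
using MathOptInterface
const MOI = MathOptInterface

# Hedged sketch of a solver wrapper consuming the new attributes.
# Field names on the returned block data are assumptions.
function load_nlp(model::MOI.ModelLike)
    block = MOI.get(model, MOI.NLPBlock())
    MOI.initialize(block.evaluator, [:Grad, :Jac])  # request derivative features
    if !block.has_objective
        # No @NLobjective: the objective arrives through the regular
        # ObjectiveFunction attribute, so the solver must handle both paths.
    end
    duals = MOI.get(model, MOI.NLPBlockDualStart())  # dual starting point, if any
    return block, duals
end
```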