Fast resolves for NLP #1185
This is mostly covered by #1223, except for duals and deciding what to do about incremental solves.
Duals are working. The only remaining issue is re-implementing fast resolves, which I'll drop from the 0.19 milestone. Under 0.18 and prior, if only parameter values changed between solves, then JuMP would skip much of the initial setup after the first solve. I don't know if anyone used this feature. Please ping or 👍 this thread if it's important.
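For readers unfamiliar with the feature being discussed: the pattern is to change only a parameter value between solves and re-optimize, ideally without JuMP redoing the model setup. A minimal sketch of that pattern using the modern JuMP `Parameter` set (this is an illustrative example, not code from this thread; it assumes JuMP ≥ 1.10 and Ipopt.jl are installed):

```julia
# Sketch of a "resolve with changed parameter" workflow.
# Assumes `using Pkg; Pkg.add(["JuMP", "Ipopt"])` has been run.
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x)
@variable(model, p in Parameter(2.0))  # p is a parameter, not a decision variable
@objective(model, Min, (x - p)^2)

optimize!(model)              # first solve: full setup happens here
set_parameter_value(p, 3.0)   # change only the parameter value...
optimize!(model)              # ...and resolve; ideally most setup is reused
```

The question in this issue is how much of the first solve's setup (in particular, building the AD evaluator) can be skipped on the second call to `optimize!`.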
Dropping this from the 1.0 milestone. I've yet to see anyone who has said that they care about, and have benefited from, fast NLP resolves.
I took a look at this and started hacking some things. But it's actually a little tricky, because now we abstract the AD from the solver. And even if JuMP doesn't set a new NLPBlockData, solvers like Ipopt re-run the evaluator setup on every solve anyway. I'll also note that this has had no comments since 2019, and only one thumbs up, so the comment above ("I've yet to see anyone who has said that they care") is probably still true. The overhead of setting up the AD system obviously isn't that big of a deal. An alternative approach moving forward (one that would require a fair bit more work) is probably to allow solvers like Ipopt to accept a nonlinear model directly. I vote we close this. If someone asks in future, we can revisit.
Maybe now, with https://github.com/odow/SymbolicAD.jl, it makes more sense to care about this?
Not sure. It'd still take some changes in Ipopt.jl to implement properly. I think we should wait to see someone complain that it is a bottleneck. Premature optimization and all that.
We discussed this on the monthly nonlinear call. |
@ccoffrin wants to properly benchmark on an AC power flow problem. |
I talked with the folks at https://github.com/LAMPSPUC yesterday, and this is a problem for them with PowerModels. So yes, I've now seen it in the wild. (They want to solve a sequence of 15-minute power-flow problems over a 4 year time horizon.) It's the exact problem from the benchmark in jump-dev/Ipopt.jl#321.
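The use case described above is a long sequence of near-identical solves where only input data changes between steps. A hedged sketch of that loop structure with JuMP parameters (the model, the `loads` data, and the `solve_sequence` function name are all hypothetical, for illustration only; assumes JuMP ≥ 1.10 and Ipopt.jl):

```julia
# Illustrative pattern: one model, many solves, only parameter data changes.
using JuMP, Ipopt

function solve_sequence(loads::Vector{Float64})
    model = Model(Ipopt.Optimizer)
    set_silent(model)
    @variable(model, x >= 0.1)
    @variable(model, d in Parameter(first(loads)))  # load at the current time step
    @constraint(model, x^2 >= d)                    # stand-in nonlinear constraint
    @objective(model, Min, x)
    results = Float64[]
    for load in loads
        set_parameter_value(d, load)  # update data for this time step
        optimize!(model)              # resolve; setup should be amortized
        push!(results, value(x))
    end
    return results
end

solve_sequence([1.0, 1.21, 1.44])  # e.g. three consecutive time steps
```

If the evaluator is rebuilt on every `optimize!` inside this loop, the setup cost is paid tens of thousands of times over a multi-year horizon, which is why it matters here even though a single setup is cheap.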
Indeed. This repeated AC power flow problem crops up in many places in power systems research, there are similar use cases in all infrastructure sectors, and I would guess MPC as well. At the moment, frameworks like SIIP and PowerModels have built non-JuMP alternatives to hit the required performance targets for these workflows.
Running @ccoffrin's (private) examples, but with https://github.com/lanl-ansi/nlp-jump-examples/blob/main/pf/nlparameter.jl updated to use the new nonlinear syntax and parameters, we now get:

```julia
julia> include("pf/base.jl"); @time bench_pf_base(case="pglib_opf_case2000_goc.m")
...
179.875041 seconds (1.05 G allocations: 62.591 GiB, 26.58% gc time)

julia> include("pf/nlsolve.jl"); @time bench_pf_nlsolve(case="pglib_opf_case2000_goc.m")
...
25.218796 seconds (56.25 M allocations: 4.750 GiB, 4.83% gc time, 0.02% compilation time)

julia> include("pf/nlparameter.jl"); @time bench_pf_nlparameter(case="pglib_opf_case2000_goc.m")
...
42.277532 seconds (40.22 M allocations: 1.928 GiB, 2.84% gc time)
```

So we're much better than before, but there's still a little left on the table. I'd need to profile where exactly the remaining time is spent, but it's no longer JuMP's fault for rebuilding the evaluator every time. I don't know if there's an easy way to test this at the JuMP level, so we could consider just closing.
That's a really nice improvement, yay!
Closing because there's nothing left to do here. The key bottleneck was that JuMP rebuilt the nonlinear evaluator on every solve.
jump-dev/MathOptInterface.jl#202. Needs more infrastructure outside JuMP first (i.e. MOIT, MOIU, solver wrappers). We'll have to think about how we want to test this on the JuMP side.