
Remove the warning in operator_warn #3910

Closed
odow opened this issue Jan 10, 2025 · 5 comments · Fixed by #3911

Comments

@odow
Member

odow commented Jan 10, 2025

I think we should consider removing this warning:

JuMP.jl/src/operators.jl

Lines 279 to 294 in d75a0f8

function operator_warn(model::GenericModel)
    model.operator_counter += 1
    if model.operator_counter > 20000
        @warn(
            "The addition operator has been used on JuMP expressions a large " *
            "number of times. This warning is safe to ignore but may " *
            "indicate that model generation is slower than necessary. For " *
            "performance reasons, you should not add expressions in a loop. " *
            "Instead of x += y, use add_to_expression!(x,y) to modify x in " *
            "place. If y is a single variable, you may also use " *
            "add_to_expression!(x, coef, y) for x += coef*y.",
            maxlog = 1
        )
    end
    return
end

This warning is quite confusing because it doesn't identify which part of the user's code has the issue, and its suggestion may be completely irrelevant if the user called a method that internally uses + via a fallback.
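For reference, here is a minimal sketch of the loop pattern the warning targets and the in-place alternative it recommends (the model and variable names are made up, not taken from any of the threads below):

using JuMP

function slow_total(y)
    total = AffExpr(0.0)
    for yi in y
        total += 2.0 * yi                      # rebuilds `total` on every iteration
    end
    return total
end

function fast_total(y)
    total = AffExpr(0.0)
    for yi in y
        add_to_expression!(total, 2.0, yi)     # total += 2.0 * yi, modified in place
    end
    return total
end

model = Model()
@variable(model, y[1:30_000])
slow_total(y)   # enough `+` calls to cross the 20,000 threshold, so the warning prints once
fast_total(y)   # no warning, and far fewer allocations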

This warning has come up 7 times on Discourse since 2019 (two related threads in the last week):

https://discourse.julialang.org/search?q=%22The%20addition%20operator%20has%20been%20used%20on%22%20order%3Alatest

Of those, really only 1 case was actually relevant, and even then, the user didn't know how to fix it.

https://discourse.julialang.org/t/add-an-optimize-assert-optimal-functionality-in-jump-would-be-handier/124458

To be honest, I don't know what the problem is, because the code is quite convoluted, with a lot of indirection.

https://discourse.julialang.org/t/jump-speeding-up-constraint-violation-inspection/111303

The offending code was:

violation = value(2(τ[i,j]+τ[j,q]+τ[p,q]) - τ[i,j] - τ[i,p] - τ[j,p] - (8 + 6*sum(x[q,v,2] for v in TAXA if v(i,j,p,q))))

but the fix was to first take the value of τ and then compute the expression, rather than forming an expression and then taking its value. The warning was not helpful.
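A hedged sketch of that fix (τ, x, TAXA, and the indices mirror the thread; the inner filter is elided):

using JuMP

function violation_value(τ, x, TAXA, i, j, p, q)
    τv = value.(τ)   # query values once; everything below is plain Float64 arithmetic
    xv = value.(x)
    return 2 * (τv[i,j] + τv[j,q] + τv[p,q]) - τv[i,j] - τv[i,p] - τv[j,p] -
           (8 + 6 * sum(xv[q,v,2] for v in TAXA))
end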

https://discourse.julialang.org/t/help-with-performance-in-jump/57874

The offending code was (where I is a matrix of variables):

for k=1:K
    @constraint(model, I[:,k] .== sum(I,dims=2)/K)
end

This wasn't hitting a fast sum, and they were also recomputing the sum K times... The warning was helpful in telling them that something was wrong, but it didn't provide any actionable help.
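A sketch of the faster version (model, I, and K are stand-ins from the thread): hoist the sum out of the loop so it is built once:

row_avg = vec(sum(I, dims = 2)) ./ K   # one vector of affine expressions, computed once
for k = 1:K
    @constraint(model, I[:,k] .== row_avg)
end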

https://discourse.julialang.org/t/jump-warning-messages/35143

Here the warning is actually pretty spot on, because the offending code is:

M=[AffExpr(0.0) for k = 1:8]
M[1] = A[1]
for k = 2:length(A)
    M[k] = (A[k]*Disc[k]+M[k-1])
end

and the fix is

M=[AffExpr(0.0) for k = 1:8]
M[1] = A[1]
for k = 2:length(A)
    add_to_expression!(M[k], A[k] * Disc[k])
    add_to_expression!(M[k], M[k-1])
end
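If Disc[k] is a plain number (an assumption about the user's data), the three-argument form mentioned in the warning also avoids building the temporary A[k]*Disc[k] expression:

for k = 2:length(A)
    add_to_expression!(M[k], Disc[k], A[k])   # M[k] += Disc[k] * A[k] without a temporary
    add_to_expression!(M[k], M[k-1])
end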

But the user complained only about the warning, not that the code was slow.

https://discourse.julialang.org/t/how-to-solve-large-sparse-lps-in-gurobi/26402/3

From 2019; the code has Q' * X * Q. I think we handle this better now.

https://discourse.julialang.org/t/jump-slow-model-generation/1623

From 2017, when we didn't support dot. This has since been fixed.

@WalterMadelim

To be honest, I don't know what the problem is, because the code is quite convoluted, with a lot of indirection.

A Benders-decomposition-like algorithm that is used to solve a multistage convex DRO unit commitment problem (potentially to global optimality, and in this case, it can). If you are interested, I can share the doc of the algorithm with you.

@odow
Member Author

odow commented Jan 11, 2025

I didn't mean it as a comment on your code. I more meant that the style is hard to follow on Discourse: I can't easily tell what the variables and data are, what each function is doing, whether you wrote it, or whether, like dot, it is imported under a different name.

multistage convex DRO Unit Commitment problem

I assume that you've seen https://github.com/odow/SDDP.jl 😄 I'd love to know how you can claim global optimality... (SDDP.jl can find local policies for DRO unit commitment). If it's something other than a MIP-based Lipschitz cut thing, feel free to send me an email with the doc 😄

@WalterMadelim

Lipschitz cut

I'm not talking about that. I initiated a chat via DM on Discourse; you can leave your email address there.

@mlubin
Member

mlubin commented Jan 11, 2025

I agree this is a weird one-off case that has probably outlived its usefulness. We could think about a more general performance-counter system for JuMP that would help with debugging or identifying performance issues.
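Purely as an illustration of that idea (nothing here is an existing JuMP API), such a system could be a per-operation counter that a profiling report reads after model generation:

# Hypothetical sketch only: count expensive operations instead of warning.
const PERF_COUNTERS = Dict{Symbol,Int}()

count_op!(op::Symbol) = (PERF_COUNTERS[op] = get(PERF_COUNTERS, op, 0) + 1; nothing)

function report_counters(io::IO = stdout)
    for (op, n) in sort!(collect(PERF_COUNTERS); by = last, rev = true)
        println(io, rpad(string(op), 30), n)
    end
end

# Hot paths would call, e.g., count_op!(:affexpr_plus_affexpr), and a user investigating
# slow model generation would call report_counters() to see where the time is likely going.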

@odow
Member Author

odow commented Jan 12, 2025

I've opened #3911 to remove the warning, and I've made a note in #3664 (comment) to add a more generic profiler.
