Test regularizationtools v0.6 #107

Merged: 4 commits merged into master from ChrisRackauckas-patch-1 on Mar 8, 2022

Conversation

ChrisRackauckas (Member)

No description provided.

@ChrisRackauckas (Member, Author)

@jjstickel can you help with this update?

@jjstickel (Contributor)

Sure. Some of the numerical minimization methods in RegularizationTools changed, which is probably causing slight differences in the results. I'll try to take a look sometime in the next few days.

@jjstickel (Contributor)

@ChrisRackauckas I've made changes to work with RegularizationTools 0.6 in a local branch pulled from your branch here. However, I do not have permission to commit back to this repository. Do I push the branch to my fork and make a new pull request? This is my first time trying to contribute to a pull request initiated by someone else.

@jjstickel (Contributor)

@ChrisRackauckas, I created PR #108 to bring my changes into your branch and this PR. Please review.

Commit: revisions for RegularizationTools 0.6

@@ -37,33 +37,33 @@ tolerance = 1e-6
@test isapprox(A.û, ans, rtol=tolerance)
# L-curve to determine λ
A = RegularizationSmooth(uₒ,tₒ; alg=:L_curve)
@test isapprox(A.λ, 0.3339750913481082, rtol=tolerance)
ans = [0.3038273346447746 0.3695574797289306 0.7261740581164446 0.8945169297047363 0.9100773683859941 0.9048101771171335 0.8663722295718835 0.8188893880450858 0.7520030529566242 0.641893306140736 0.35912131844587486 0.12553463033078033 -0.45446129364457777 -0.8246966349034558 -1.118321479210823]'
@test isapprox(A.λ, 0.9536286111306728, rtol=tolerance)
Review comment (Contributor):

@jjstickel Is it expected that λ changes so much?

@jjstickel (Contributor), Mar 8, 2022:

Thank you for the review. It turns out that the example I chose here is not the most robust, and neither is the L-curve method for finding the optimal lambda. So switching to optimizing over log(lambda) (an upstream change) resulted in a different optimal lambda.
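To illustrate the scale effect, here is a toy sketch (not RegularizationTools code; `curvature` is an invented objective standing in for the real L-curve criterion): searching over log10(λ) weights each decade of λ evenly, so the optimizer can converge to a different local minimum than a linear-scale search over the same interval.

```julia
# Toy illustration only: `curvature` is a made-up stand-in for the
# actual L-curve objective in RegularizationTools.
using Optim

curvature(λ) = (log10(λ) - 0.5)^2 + 0.2 * sin(4 * log10(λ))

# Linear-scale univariate search over λ (Brent's method):
res_lin = optimize(curvature, 1e-6, 1e2)

# Log-scale search over x = log10(λ), mirroring the upstream change:
res_log = optimize(x -> curvature(10.0^x), -6.0, 2.0)
λ_opt = 10.0^Optim.minimizer(res_log)
```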

ans = [0.25185799867326253 0.32130333169345704 0.6679088195845279 0.927260505998722 0.9914594632495953 1.0028090695917116 0.9749355187531624 0.9254679051053563 0.8291064785367862 0.6493316082309926 0.2324329561930743 0.07635651700236243 -0.5564454824473795 -0.817911933888329 -1.1130193817730065]'
# arbitrary weights for wls (and fixed λ, GCV not working well for some of these)
A = RegularizationSmooth(uₒ,tₒ, nothing, collect(1:npts); λ=1e-1, alg=:fixed)
# @test isapprox(A.λ, 0.1447077651169621, rtol=tolerance)
Review comment (Contributor):

Should the value be updated instead of the test being commented out?

@jjstickel (Contributor), Mar 8, 2022:

Similarly, optimization via GCV(lambda) with these weight vectors now finds a lower value of lambda than most data scientists would likely want. Hence, I use a fixed lambda for these couple of tests for now. The next addition I plan to make (when I have time again) will be to let the user provide min and max values for lambda (rather than an initial guess). Then a minimum can be specified to prevent GCV from settling on an impractically small value of lambda.
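A minimal sketch of what such a bounded search could look like (the helper name `find_λ` and its keyword arguments are hypothetical, not an existing RegularizationTools API):

```julia
# Hypothetical sketch: restrict the GCV search to [λ_min, λ_max] so the
# optimizer cannot settle on an impractically small λ. `gcv` is assumed
# to be the scalar objective to minimize.
using Optim

function find_λ(gcv; λ_min = 1e-4, λ_max = 1e2)
    res = optimize(x -> gcv(10.0^x), log10(λ_min), log10(λ_max))
    return 10.0^Optim.minimizer(res)
end
```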

Review comment (Contributor):

Then I'd mark these as @test_broken for the time being. That makes it more visible that they should be revisited.

@jjstickel (Contributor):

I'm not yet familiar with @test_broken. Is the utility that it will still let you know if an exception occurs?

Even so, I'm not sure that's the right thing to do here. The previous test will never be correct now, and it is not necessarily wrong to test here with a fixed lambda. Even with my planned updates, this test would still be valid, although I expect to change it when those updates occur. But if you still prefer @test_broken, I am happy to make that change -- let me know.

Review comment (Contributor):

> Is the utility that it will still let you know if an exception occurs?

It's a way to track whether a test returns false. It will show up in the test summary, and it will throw if the test suddenly passes. It's not for testing exceptions, though; for that you'd use @test_throws.

Either make it a passing test, a @test_broken, or delete the test if it's not relevant. My objection is to having the test commented out.
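For reference, a minimal illustration of the three options using the standard Julia Test stdlib (generic behavior, not specific to this package):

```julia
using Test

@testset "lambda tests" begin
    @test 1 + 1 == 2                     # ordinary assertion: must pass
    @test_broken 1 + 1 == 3              # known failure: recorded as Broken, suite still passes
    @test_throws DomainError sqrt(-1.0)  # asserts the call throws a DomainError
end
# If a @test_broken expression unexpectedly passes, it is recorded as an
# error ("Unexpected Pass"), flagging that it should become a plain @test.
```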

@jjstickel (Contributor):

Got it. I made another PR to this branch (#109).

@test isapprox(A.λ, 0.01920769986569919, rtol=tolerance)
ans = [0.16161677980889166 0.34630920002600707 0.6469896059918103 0.9219255811754361 1.0001382485019457 1.0260666135098766 1.0278670301547874 0.9823839714037528 0.8512856706983483 0.6416080323126064 0.1892473613483678 0.10264529505906245 -0.5680977857976282 -0.8075007645180948 -1.1168524120952026]'
A = RegularizationSmooth(uₒ,tₒ, nothing, wls, wr; λ=1e-1, alg=:fixed)
# @test isapprox(A.λ, 0.01920769986569919, rtol=tolerance)
Review comment (Contributor):

Same

@jjstickel (Contributor):

Same answer as above.

ChrisRackauckas merged commit ee5071d into master on Mar 8, 2022
ChrisRackauckas deleted the ChrisRackauckas-patch-1 branch on March 8, 2022 at 21:46