Make use of learning rate scheduler optional #449

Merged 1 commit into ashleve:main on Oct 5, 2022

Conversation

@amorehead (Contributor) commented on Sep 28, 2022

What does this PR do?

Made trainer.configure_optimizers() robust to an unspecified learning rate scheduler.
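
For context, a minimal sketch of the pattern this PR describes, not the merged diff itself: the `LitModule` name, the `self.hparams.optimizer` / `self.hparams.scheduler` fields (Hydra-style partials, as in this template), and the `val/loss` monitor key are illustrative assumptions.

```python
from typing import Any, Callable, Optional

from pytorch_lightning import LightningModule


class LitModule(LightningModule):
    """Hypothetical module illustrating an optional LR scheduler."""

    def __init__(
        self,
        optimizer: Callable,  # e.g. a functools.partial over a torch.optim optimizer
        scheduler: Optional[Callable] = None,  # None means "no scheduler configured"
    ) -> None:
        super().__init__()
        # Exposes the constructor args as self.hparams.optimizer / self.hparams.scheduler.
        self.save_hyperparameters(logger=False)

    def configure_optimizers(self) -> Any:
        optimizer = self.hparams.optimizer(params=self.parameters())

        # Only attach an lr_scheduler entry when one was actually configured.
        if self.hparams.scheduler is not None:
            scheduler = self.hparams.scheduler(optimizer=optimizer)
            return {
                "optimizer": optimizer,
                "lr_scheduler": {
                    "scheduler": scheduler,
                    "monitor": "val/loss",  # illustrative metric name
                    "interval": "epoch",
                    "frequency": 1,
                },
            }

        # No scheduler specified: return the optimizer alone.
        return {"optimizer": optimizer}
```

With this guard in place, configs that omit the scheduler no longer fail when the returned dictionary is built.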

Before submitting

  • Did you make sure the title is self-explanatory and the description concisely explains the PR?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you test your PR locally with the `pytest` command?
  • Did you run pre-commit hooks with the `pre-commit run -a` command?

Did you have fun?

Yep 🙃

* Made `trainer.configure_optimizers()` robust to unspecified learning rate schedulers
@amorehead marked this pull request as draft on Oct 5, 2022, 22:14
@amorehead (Contributor, Author) commented

@ashleve, LGTY?

@amorehead marked this pull request as ready for review on Oct 5, 2022, 22:15
@ashleve (Owner) commented on Oct 5, 2022

@amorehead Thank you for the ping!

I like this improvement.

@ashleve (Owner) left a review comment

LGTM!

@ashleve merged commit ad0f46c into ashleve:main on Oct 5, 2022
@ashleve added the `enhancement` (New feature or request) and `refactoring` labels on Oct 5, 2022