Updates for PyTorch Lightning 2.0 release #266
Conversation
✅ Deploy Preview for silly-keller-664934 ready!
Codecov Report
Additional details and impacted files

@@            Coverage Diff             @@
##           master     #266      +/-  ##
==========================================
- Coverage    87.6%    87.6%     -0.1%
==========================================
  Files          26       26
  Lines        2155     2178       +23
==========================================
+ Hits         1889     1908       +19
- Misses        266      270        +4
@pjbull tests are now passing; this is ready for your review when you have time 🙇
Looks good to me!
Configure accelerator and devices for Trainer
Updates how we set the accelerator and devices based on recent PyTorch Lightning changes. `pl.Trainer` now accepts `accelerator` ("cpu", "gpu", etc.), `strategy`, and `devices` params instead of `num_gpus`. This PR adds a utility to configure accelerator and devices based on the number of user-specified GPUs.

Pros:

Cons:
Relevant links:
Closes #264
Closes #265
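For illustration, a utility of the kind described above might look like this (a minimal sketch; the function name and exact behavior are assumptions, not the PR's actual code):

```python
# Hypothetical sketch of mapping a user-specified GPU count onto the
# PL 2.0-style `accelerator`/`devices` Trainer params. The real utility
# in this PR may differ in name, signature, and edge-case handling.
def configure_accelerator_and_devices_from_gpus(gpus):
    """Return (accelerator, devices) for `pl.Trainer` from a GPU count.

    gpus: number of GPUs requested; None or 0 means CPU-only training.
    """
    if not gpus:
        # No GPUs requested: fall back to CPU with automatic device selection.
        return "cpu", "auto"
    # Otherwise train on the requested number of GPU devices.
    return "gpu", gpus


# The results would then be passed straight through to the 2.0-style Trainer:
#   pl.Trainer(accelerator=accelerator, devices=devices)
accelerator, devices = configure_accelerator_and_devices_from_gpus(2)
```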
Other changes based on PTL 2.0 release
The new PTL release makes a number of backward-incompatible API changes. This PR also provides fixes for those and sets 2.0 as the version floor.
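A naive check of that 2.0 floor could be sketched as follows (purely illustrative; the PR presumably pins the floor in the package requirements instead, and real code should compare versions with `packaging.version`):

```python
# Illustrative only: a naive major.minor version-floor check.
# Real code should use packaging.version.parse, which correctly
# handles pre-releases and other version-string edge cases.
def meets_floor(installed: str, floor: str = "2.0") -> bool:
    """Return True if `installed` meets the minimum `floor` version."""
    def as_tuple(version: str):
        # Compare only the major.minor components.
        return tuple(int(part) for part in version.split(".")[:2])
    return as_tuple(installed) >= as_tuple(floor)
```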
- `Trainer` and `Tuner` (Decouple Tuner from Trainer Lightning-AI/pytorch-lightning#16462)

Bonus fixes