Bye bye torch <2 #3331
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Nice ! LGTM
Thanks for the PR, I think it's a good idea to finally make the switch.
I searched the code base for potential references to older PyTorch versions that were missed in this PR. What I found:
- PyTorch version (GPU?): 1.12.0+cu102 (True)
accelerate/docs/source/usage_guides/sagemaker.md
Lines 148 to 149 in 54370d4
```yaml
py_version: py38
pytorch_version: 1.10.2
```
Also old Python version. Doesn't really matter much, I'd also be okay with completely removing the examples or putting placeholder values there, as they will keep getting outdated.
1. We strongly recommend to install PyTorch >= 1.13 (nightly version at the time of writing) on your MacOS machine.
Could be updated to recommend >= 2.0
accelerate/src/accelerate/data_loader.py
Lines 60 to 65 in 54370d4
```python
# kwargs added after by version
_PYTORCH_DATALOADER_ADDITIONAL_KWARGS = {}

for v, additional_kwargs in _PYTORCH_DATALOADER_ADDITIONAL_KWARGS.items():
    if is_torch_version(">=", v):
        _PYTORCH_DATALOADER_KWARGS.update(additional_kwargs)
```
Not directly related to this PR, but this looks like dead code now.
accelerate/src/accelerate/data_loader.py
Line 792 in 54370d4
```python
if is_torch_version(">=", "2.0.1"):
```
Can we assume that if users are on 2.0, they will be using the latest patch release? Then we could remove this check too.
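For reference, a minimal sketch of what the guard distinguishes, assuming `is_torch_version` ultimately compares versions with `packaging`:

```python
from packaging.version import parse

# The check is False on torch 2.0.0 but True on 2.0.1 and later, so removing
# it is only safe if 2.0.0 itself is no longer supported.
print(parse("2.0.0") >= parse("2.0.1"))  # False
print(parse("2.0.1") >= parse("2.0.1"))  # True
```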
accelerate/src/accelerate/test_utils/testing.py
Lines 335 to 339 in 54370d4
```python
def require_fsdp(test_case):
    """
    Decorator marking a test that requires FSDP installed. These tests are skipped when FSDP isn't installed
    """
    return unittest.skipUnless(is_torch_version(">=", "1.12.0"), "test requires torch version >= 1.12.0")(test_case)
```
Could be removed completely. Although elsewhere, the min torch version for FSDP is defined as 2.1.0, so I'm not quite sure.
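If the decorator is kept rather than removed, a minimal sketch of aligning it with that 2.1.0 minimum (just a suggestion, assuming `is_torch_version` remains the helper used for such checks):

```python
import unittest

from accelerate.utils import is_torch_version


def require_fsdp(test_case):
    """
    Decorator marking a test that requires FSDP. Skips the test unless torch >= 2.1.0,
    matching the FSDP minimum defined elsewhere in the code base.
    """
    return unittest.skipUnless(is_torch_version(">=", "2.1.0"), "test requires torch version >= 2.1.0")(test_case)
```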
accelerate/src/accelerate/utils/constants.py
Lines 32 to 34 in 54370d4
```python
SAGEMAKER_PYTORCH_VERSION = "1.10.2"
SAGEMAKER_PYTHON_VERSION = "py38"
SAGEMAKER_TRANSFORMERS_VERSION = "4.17.0"
```
Is this outdated?
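If so, a hedged sketch of what a refresh could look like; the exact version triple is an assumption and would need to be checked against the Hugging Face SageMaker DLCs that actually exist:

```python
# Assumed values for illustration only; verify against the published
# Hugging Face Deep Learning Container matrix before updating.
SAGEMAKER_PYTORCH_VERSION = "2.0.0"
SAGEMAKER_PYTHON_VERSION = "py310"
SAGEMAKER_TRANSFORMERS_VERSION = "4.28.1"
```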
accelerate/src/accelerate/utils/imports.py
Lines 391 to 392 in 54370d4
```python
if is_torch_version("<=", "1.12"):
    return False
```
accelerate/src/accelerate/accelerator.py
Lines 1611 to 1612 in 54370d4
```python
if not is_torch_version(">=", "2.0"):
    raise ValueError("Using `torch.compile` requires PyTorch 2.0 or higher.")
```
accelerate/tests/test_utils.py
Line 198 in 54370d4
@require_torch_min_version(version="2.0") |
accelerate/tests/test_utils.py
Line 242 in 54370d4
@require_torch_min_version(version="2.0") |
accelerate/tests/test_utils.py
Line 314 in 54370d4
@require_torch_min_version(version="1.12") |
Can be removed.
accelerate/src/accelerate/utils/modeling.py
Lines 585 to 602 in 54370d4
```python
def _get_named_parameters(module: torch.nn.Module, prefix="", recurse=True, remove_duplicate: bool = True):
    """
    Help yield various names + members of modules. Copied from PyTorch `torch.nn.Module.named_modules` for
    compatability with torch < 2.0 versions with `remove_duplicate` option added.
    """
    memo = set()
    modules = (
        _get_named_modules(module, prefix=prefix, remove_duplicate=remove_duplicate) if recurse else [(prefix, module)]
    )
    for module_prefix, module in modules:
        members = module._parameters.items()
        for k, v in members:
            if v is None or v in memo:
                continue
            if remove_duplicate:
                memo.add(v)
            name = module_prefix + ("." if module_prefix else "") + k
            yield name, v
```
If I understand the docstring correctly, this whole function can be removed.
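For context, on torch >= 2.0 `torch.nn.Module.named_parameters` accepts `remove_duplicate` directly, so a call like the sketch below could presumably take over (assuming no other behavioural differences matter here):

```python
import torch

model = torch.nn.Linear(4, 4)

# Rough equivalent of _get_named_parameters(model, recurse=True, remove_duplicate=True)
# on torch >= 2.0, where named_parameters exposes remove_duplicate natively.
for name, param in model.named_parameters(prefix="", recurse=True, remove_duplicate=True):
    print(name, param.shape)
```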
Are you copying my tone 😄
It's a wonderful tone, if I do say so myself @ydshieh 😉
Re: the patch, it only came out a few months later. 2.0.0 still sees pip installs, so for now I'm opting to keep the check rather than require 2.0.1.
@BenjaminBossan I believe I've hit everything now + updated the torch DataLoader kwargs with what's upcoming in 2.6
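For readers following along, a hedged sketch of what re-populating the version-gated kwargs table could look like; the `in_order` flag is an assumption based on the DataLoader argument slated for torch 2.6, not necessarily the exact change made in this PR:

```python
# Kwargs that only apply once the installed torch is new enough; the names are
# taken from the data_loader.py snippet quoted above, "in_order" is the assumed 2.6 flag.
_PYTORCH_DATALOADER_ADDITIONAL_KWARGS = {"2.6.0": {"in_order": True}}

for v, additional_kwargs in _PYTORCH_DATALOADER_ADDITIONAL_KWARGS.items():
    if is_torch_version(">=", v):
        _PYTORCH_DATALOADER_KWARGS.update(additional_kwargs)
```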
Thanks for the update, all my points are addressed.
What does this PR do?
Drops torch<2 support
Fixes # (issue)
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@BenjaminBossan @SunMarc