
importlib_metadata.version("torch") does not work #190

Closed
Pet222 opened this issue Jan 1, 2022 · 5 comments
Labels
pytorch-directml Issues in PyTorch when using its DirectML backend

Comments

@Pet222

Pet222 commented Jan 1, 2022

Hi!

I am able to run the DML backend (aside from the already mentioned ATen implementation gap). But when I try to use Hugging Face Transformers, it fails: importlib_metadata.version("torch") does not work, so the Hugging Face backend check fails. Can you confirm this behavior?
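
A minimal repro sketch of the failure (hypothetical environment, assuming pytorch-directml is installed in place of the stock torch wheel):

```python
import importlib.util
import importlib_metadata

# The torch module itself imports fine under pytorch-directml...
assert importlib.util.find_spec("torch") is not None

# ...but looking up distribution metadata under the name "torch" fails,
# which is the check Transformers performs:
try:
    print(importlib_metadata.version("torch"))
except importlib_metadata.PackageNotFoundError:
    print("no distribution named 'torch' found")
```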

@jstoecker jstoecker added the pytorch-directml Issues in PyTorch when using its DirectML backend label Jan 10, 2022
@ryanlai2
Contributor

Hi @Pet222, thanks for reporting this behavior. Can you provide a minimal Python repro script for us to test?

@zhangxiang1993
Member

Hi @Pet222, this is expected: we renamed the package to pytorch-directml, so you'll need to change that to importlib_metadata.version("pytorch-directml").
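
For example (sketch of the suggested workaround):

```python
import importlib_metadata

# Query the distribution under its renamed name instead of "torch":
print(importlib_metadata.version("pytorch-directml"))
```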

@compwiztobe

compwiztobe commented Aug 30, 2022

The package is called pytorch-directml, but it still provides the module torch. If import torch works (or importlib.util.find_spec, etc.), shouldn't importlib_metadata.version("torch") work too?

Here's the code in Huggingface: link

It correctly finds that the torch module is importable with pytorch-dml installed, but cannot find the version number. I also came here thinking this would be the place to fix this (the pytorch-directml package providing additional metadata for the torch version on sys.path, or something like that), rather than having Hugging Face (and probably many other projects) check for any of several different package names providing the torch module. Here's a relevant issue on importlib suggesting that this is already supported, depending only on the package providing the appropriate metadata.
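
A sketch of the metadata-based resolution that issue describes, assuming a recent importlib_metadata (packages_distributions() maps top-level module names to the distributions that provide them):

```python
import importlib_metadata

# Resolve the module name "torch" to whichever distribution(s) supply it,
# then read the version from the distribution actually installed:
for dist in importlib_metadata.packages_distributions().get("torch", []):
    print(dist, importlib_metadata.version(dist))
```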

@smk2007
Member

smk2007 commented Dec 16, 2022

Hi @Pet222 and @compwiztobe, torch-directml has moved to a plugin model, which resolved many of the dependency issues that could cause problems like the one above. Check out the latest release here: torch-directml.
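
Rough sketch of the plugin-style usage (assuming the current torch-directml release, where the stock torch package stays installed and its metadata resolves normally):

```python
import torch
import torch_directml

# The plugin exposes DirectML as a device on top of the stock torch package:
dml = torch_directml.device()
x = torch.ones(2, 2, device=dml)
print(x * 2)
```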

Please take a look and reactivate if you still face any issues.

Thanks!

@smk2007 smk2007 closed this as completed Dec 16, 2022
@Pet222
Author

Pet222 commented Dec 20, 2022

Sounds great. Honestly, I stopped using torch-directml since a lot of the implementation was missing back then (e.g. ATen), and now I have an NVIDIA GPU so I can use PyTorch directly. Therefore I will not set up the environment again to confirm this, so closing this ticket is fine on my end (assuming a test for importlib_metadata.version("torch") is now implemented in the torch-directml CI and it passes).
