importlib_metadata.version("torch") does not work #190
Comments
Hi @Pet222, thanks for reporting this behavior. Can you provide a minimal repro Python script for us to test?
Hi @Pet222, this is expected, because we renamed the package to pytorch-directml, so you'll need to change that call to `importlib_metadata.version("pytorch-directml")`.
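For anyone landing here, a minimal sketch of the behavior under discussion (assuming an environment with pytorch-directml installed; not taken from the thread):

```python
import importlib_metadata

# The "torch" module is importable, but no installed distribution registers
# metadata under the name "torch", so this lookup fails:
try:
    importlib_metadata.version("torch")
except importlib_metadata.PackageNotFoundError:
    print("no distribution metadata found under the name 'torch'")

# Querying by the actual distribution name works:
print(importlib_metadata.version("pytorch-directml"))
```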
The package is called pytorch-directml, but it still provides the module `torch`. Here's the code in Hugging Face: link. It correctly finds that the `torch` module is importable with pytorch-directml installed, but cannot find the version number. I also came here thinking this would be the place to fix this (the pytorch-directml package providing additional metadata for the `torch` version on sys.path, or something like that), rather than having Hugging Face (and probably many other projects) check for any of several different package names providing the `torch` module. Here's a relevant issue on importlib suggesting that this is already supported, depending only on the package providing the appropriate metadata.
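The mechanism the linked importlib issue appears to refer to is the module-to-distribution mapping exposed as `packages_distributions()` (available in the importlib_metadata backport and in the stdlib since Python 3.10). A sketch of how a consumer could resolve the version without hard-coding distribution names, assuming the wheel ships the relevant top-level metadata:

```python
import importlib_metadata

# Map importable top-level module names to the distributions providing them.
# With pytorch-directml installed this could yield {"torch": ["pytorch-directml"], ...}
dists = importlib_metadata.packages_distributions().get("torch", [])
for dist_name in dists:
    print(dist_name, importlib_metadata.version(dist_name))
```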
Hi @Pet222 and @compwiztobe, torch-directml has moved to a plugin model. The plugin model resolved many of the dependency problems that could cause issues like the one you saw above. Check out the latest release here: torch-directml. Please take a look and reopen if you still face any issues. Thanks!
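Under the plugin model, torch-directml is installed alongside the stock `torch` distribution rather than replacing it, so `importlib_metadata.version("torch")` resolves to the upstream PyTorch metadata. A brief usage sketch, following the documented torch-directml API:

```python
import torch
import torch_directml  # plugin package, installed via `pip install torch-directml`

# DirectML is exposed as an ordinary torch device by the plugin:
dml = torch_directml.device()
t = torch.ones(2, 2, device=dml)
print(t * 2)
```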
Sounds great. Honestly, I stopped using torch-directml since a lot of the implementation was missing back then (e.g. ATen), and I now have an NVIDIA GPU, so I can use PyTorch directly. Therefore I will not set up the environment again to confirm this, so closing this ticket is fine on my end (assuming a test for `importlib_metadata.version("torch")` is now implemented in the torch-directml CI and it passes).
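Such a regression guard could be as small as the following pytest sketch (hypothetical; there is no confirmation in this thread that the torch-directml CI contains it):

```python
import importlib_metadata

def test_torch_version_metadata_resolves():
    # Guard against the regression reported in this issue: the "torch"
    # module must map to resolvable distribution metadata.
    assert importlib_metadata.version("torch")
```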
Hi!
I am able to run the DML backend (besides the already mentioned ATen implementation gap).
But when I try to use the Hugging Face transformers library, it fails because `importlib_metadata.version("torch")` does not work and the Hugging Face backend check fails. Can you confirm this behavior?
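A minimal repro, in case it helps (assuming a pytorch-directml environment; this roughly mirrors the two-step availability check transformers performs):

```python
import importlib.util
import importlib_metadata

# Step 1: the module is importable -- this succeeds with pytorch-directml installed.
assert importlib.util.find_spec("torch") is not None

# Step 2: resolving the version by the distribution name "torch" is what fails:
print(importlib_metadata.version("torch"))  # raises PackageNotFoundError here
```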