Inference Bug - Too many variables to unpack #56
Comments
- Thanks for spotting this @marvision-ai! There have been some changes in the […]
- Thanks for the great repo. Reporting the same issue.
- Thank you @samet-akcay and @ashwinvaidya17 for the very fast turnaround time! Is there a reason why when I want to load and infer with a […]
Hello, thank you for the great repo!
Describe the bug
I cannot run inference; it fails with a "too many variables to unpack" error.
To Reproduce
Steps to reproduce the behavior:
1. conda install openvino-ie4py-ubuntu20 -c intel
2. conda install pytorch==1.8.0 torchvision==0.9.0 torchaudio==0.8.0 cudatoolkit=11.1 -c pytorch -c conda-forge
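For context, the openvino-ie4py package from step 1 exposes the Inference Engine Python API. A minimal, repo-independent sketch of running an exported IR model with that API is below; the model.xml/model.bin paths and the dummy input are placeholders added for illustration, not files or commands from this issue:

```python
# Minimal OpenVINO Inference Engine sketch (hypothetical IR file paths; not this repo's inference script).
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")  # placeholder IR files
exec_net = ie.load_network(network=net, device_name="CPU")

input_blob = next(iter(net.input_info))
output_blob = next(iter(net.outputs))
n, c, h, w = net.input_info[input_blob].input_data.shape

dummy = np.random.rand(n, c, h, w).astype(np.float32)  # dummy input just to exercise the network
result = exec_net.infer(inputs={input_blob: dummy})
print(result[output_blob].shape)
```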
Hardware and Software Configuration
Additional context
I also noticed that when I run inference from a .ckpt file, OpenVINO is still imported. I did not expect that, since checkpoint inference should run purely in PyTorch.
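To illustrate that expectation, a checkpoint-only inference path in plain PyTorch can look roughly like the sketch below; the DummyAnomalyModel class, the checkpoint path, and the input shape are placeholders of mine, not the repo's actual API:

```python
# PyTorch-only checkpoint inference sketch. The model architecture and checkpoint path are
# hypothetical placeholders; the real module would come from the repo's own model definitions.
import torch
import torch.nn as nn


class DummyAnomalyModel(nn.Module):
    """Stand-in for the actual model class; replace with the real one."""

    def __init__(self) -> None:
        super().__init__()
        self.backbone = nn.Conv2d(3, 8, kernel_size=3, padding=1)

    def forward(self, batch: torch.Tensor) -> torch.Tensor:
        return self.backbone(batch)


model = DummyAnomalyModel()
checkpoint = torch.load("model.ckpt", map_location="cpu")  # Lightning checkpoints keep weights under "state_dict"
model.load_state_dict(checkpoint["state_dict"], strict=False)
model.eval()

image = torch.rand(1, 3, 256, 256)  # dummy input in place of a real test image
with torch.no_grad():
    prediction = model(image)  # no OpenVINO involved anywhere in this path
print(prediction.shape)
```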