
ModuleNotFoundError: No module named 'transformers' #67

Open
tasteitslight opened this issue Apr 5, 2023 · 6 comments

Comments

@tasteitslight

Anyone else experienced this?

When running:

python -m llama.download --model_size 7B

Error log shows:

Traceback (most recent call last):
  File "/Users/willbeing/miniconda/envs/test_snakes/lib/python3.9/runpy.py", line 188, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/Users/willbeing/miniconda/envs/test_snakes/lib/python3.9/runpy.py", line 111, in _get_module_details
    __import__(pkg_name)
  File "/Users/willbeing/miniconda/envs/test_snakes/lib/python3.9/site-packages/llama/__init__.py", line 1, in <module>
    from .generation import LLaMA
  File "/Users/willbeing/miniconda/envs/test_snakes/lib/python3.9/site-packages/llama/generation.py", line 8, in <module>
    from llama.tokenizer import Tokenizer
  File "/Users/willbeing/miniconda/envs/test_snakes/lib/python3.9/site-packages/llama/tokenizer.py", line 9, in <module>
    from transformers.tokenization_utils import PreTrainedTokenizer
ModuleNotFoundError: No module named 'transformers'

@anentropic

Same for me. transformers is specified in the quant extras requirements, so it seems you have to pip install pyllama[quant] for the download script to work.
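
For what it's worth, that install command may need quoting under zsh, since the square brackets are otherwise treated as a glob pattern (plain bash accepts them unquoted):

  pip install 'pyllama[quant]'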

@anentropic

However, if I start fresh and add just pip install pyllama[quant]...

That fails to install a package named gptq:

  Command ['/Users/anentropic/Library/Caches/pypoetry/virtualenvs/experiment-llama-TFcqOWFn-py3.10/bin/python', '-m', 'pip', 'install', '--use-pep517', '--disable-pip-version-check', '--prefix', '/Users/anentropic/Library/Caches/pypoetry/virtualenvs/experiment-llama-TFcqOWFn-py3.10', '--no-deps', '/Users/anentropic/Library/Caches/pypoetry/artifacts/b7/9e/b5/9d5f4df66e2043e391e661b109c39123dc3bb4e8b0173f1222ac6e70ac/gptq-0.0.3.tar.gz'] errored with the following return code 1, and output:
  Processing /Users/anentropic/Library/Caches/pypoetry/artifacts/b7/9e/b5/9d5f4df66e2043e391e661b109c39123dc3bb4e8b0173f1222ac6e70ac/gptq-0.0.3.tar.gz
    Installing build dependencies: started
    Installing build dependencies: finished with status 'done'
    Getting requirements to build wheel: started
    Getting requirements to build wheel: finished with status 'error'
    error: subprocess-exited-with-error

    × Getting requirements to build wheel did not run successfully.
    │ exit code: 1
    ╰─> [17 lines of output]
        Traceback (most recent call last):
          File "/Users/anentropic/Library/Caches/pypoetry/virtualenvs/experiment-llama-TFcqOWFn-py3.10/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 351, in <module>
            main()
          File "/Users/anentropic/Library/Caches/pypoetry/virtualenvs/experiment-llama-TFcqOWFn-py3.10/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 333, in main
            json_out['return_val'] = hook(**hook_input['kwargs'])
          File "/Users/anentropic/Library/Caches/pypoetry/virtualenvs/experiment-llama-TFcqOWFn-py3.10/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 118, in get_requires_for_build_wheel
            return hook(config_settings)
          File "/private/var/folders/w1/_vgkxyln4c7bk8kr29s1y1k00000gn/T/pip-build-env-1ymzunpd/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 338, in get_requires_for_build_wheel
            return self._get_build_requires(config_settings, requirements=['wheel'])
          File "/private/var/folders/w1/_vgkxyln4c7bk8kr29s1y1k00000gn/T/pip-build-env-1ymzunpd/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 320, in _get_build_requires
            self.run_setup()
          File "/private/var/folders/w1/_vgkxyln4c7bk8kr29s1y1k00000gn/T/pip-build-env-1ymzunpd/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 484, in run_setup
            super(_BuildMetaLegacyBackend,
          File "/private/var/folders/w1/_vgkxyln4c7bk8kr29s1y1k00000gn/T/pip-build-env-1ymzunpd/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 335, in run_setup
            exec(code, locals())
          File "<string>", line 2, in <module>
        ModuleNotFoundError: No module named 'torch'
        [end of output]

    note: This error originates from a subprocess, and is likely not a problem with pip.
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> See above for output.

It complains about No module named 'torch', but even explicitly installing PyTorch first does not seem to fix it, presumably because pip builds the gptq sdist in an isolated build environment where an already-installed torch is not visible.
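
If that guess about build isolation is right, a sketch like the following might get past the gptq build step (untested; --no-build-isolation tells pip to build against the current environment instead of a throwaway one):

  pip install torch
  pip install 'pyllama[quant]' --no-build-isolation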

So it might be better to just pip install pyllama transformers
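
Spelled out as a full sequence (my own sketch of that workaround, not an official install path), this skips the quant extra and gptq entirely, which the comments above suggest is enough for the download script:

  pip install pyllama transformers
  python -m llama.download --model_size 7B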

@anentropic

There are some more elaborate install instructions for gptq on their homepage: https://pypi.org/project/gptq/

@NicolasIRAGNE

pip install transformers fixed this for me

@Celppu

Celppu commented May 10, 2023

I get:

  myenv2\lib\site-packages\itree\__init__.py", line 7, in <module>
    import _itree
  ModuleNotFoundError: No module named '_itree'

@HireTheHero

Same here. Maybe pin a version that is known to work and raise a small PR to include it in requirements.txt?
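
A hypothetical requirements.txt entry for such a PR might look like the line below; the version floor is purely illustrative, and whatever release is actually confirmed to work should be pinned instead:

  transformers>=4.27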
