
[Installation]: Installing vLLM 0.6.6.post1 fails on TPU #11811

Closed
1 task done
BabyChouSr opened this issue Jan 7, 2025 · 2 comments
Labels
installation Installation problems

Comments


BabyChouSr commented Jan 7, 2025

Your current environment

My Python version is 3.11.

How you are installing vllm

I build a Docker image with the following commands:

RUN cd /opt/vllm && curl -sLO "https://github.com/vllm-project/vllm/archive/refs/tags/v${VLLM_VERSION}.zip" && unzip v${VLLM_VERSION}.zip

WORKDIR /opt/vllm/vllm-${VLLM_VERSION}
RUN pip uninstall torch torch-xla -y
RUN sudo apt-get install libopenblas-base libopenmpi-dev libomp-dev -y
RUN pip install -r requirements-tpu.txt
RUN VLLM_TARGET_DEVICE="tpu" python3 setup.py develop

I get the following error:

 => ERROR [24/38] RUN pip install -r requirements-tpu.txt                                                                                                                                                  1.2s
------                                                                                                                                                                                                          
 > [24/38] RUN pip install -r requirements-tpu.txt:                                                                                                                                                             
#0 0.902 Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/nightly/cpu                                                                                                              
#0 0.902 Looking in links: https://storage.googleapis.com/libtpu-releases/index.html, https://storage.googleapis.com/jax-releases/jax_nightly_releases.html, https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html                                                                                                                                                                                           
#0 0.903 Ignoring fastapi: markers 'python_version < "3.9"' don't match your environment                                                                                                                        
#0 0.904 Ignoring six: markers 'python_version > "3.11"' don't match your environment
#0 0.904 Ignoring setuptools: markers 'python_version > "3.11"' don't match your environment
#0 0.910 ERROR: torch_xla-2.6.0.dev20241126-cp310-cp310-linux_x86_64.whl is not a supported wheel on this platform.

I assume this is because the pinned torch_xla wheel is built for Python 3.10 (the cp310 tag in its filename), which doesn't match my Python 3.11 interpreter. Could requirements-tpu.txt be changed so it doesn't hardcode the 3.10 wheel?
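The diagnosis above can be checked directly: per the wheel filename convention (PEP 427), the third-from-last dash-separated field is the Python tag, and pip rejects a wheel whose tag doesn't match the running interpreter. A minimal sketch (the helper names here are illustrative, not part of vLLM or pip):

```python
import sys

def wheel_python_tag(wheel_filename: str) -> str:
    """Return the Python tag (e.g. 'cp310') from a wheel filename.

    PEP 427 wheel names look like:
    {dist}-{version}(-{build})?-{python tag}-{abi tag}-{platform tag}.whl
    """
    parts = wheel_filename.removesuffix(".whl").split("-")
    return parts[-3]  # the Python tag is third from the end

def interpreter_tag() -> str:
    """CPython tag of the current interpreter, e.g. 'cp311' on Python 3.11."""
    return f"cp{sys.version_info.major}{sys.version_info.minor}"

wheel = "torch_xla-2.6.0.dev20241126-cp310-cp310-linux_x86_64.whl"
print(wheel_python_tag(wheel))  # cp310
# On a Python 3.11 interpreter this comparison is False, which is
# exactly why pip reports "not a supported wheel on this platform":
print(wheel_python_tag(wheel) == interpreter_tag())
```

Running `pip debug --verbose` also lists every tag the current interpreter accepts, which is a quick way to confirm whether a given wheel could ever install in that environment.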

Before submitting a new issue...

  • Make sure you have already searched for relevant issues and asked the chatbot at the bottom-right corner of the documentation page, which can answer many frequently asked questions.
@BabyChouSr BabyChouSr added the installation Installation problems label Jan 7, 2025
Collaborator

robertgshaw2-redhat commented Jan 7, 2025

Fixed by #11695

@BabyChouSr
Author

Sorry for not seeing this earlier, and thank you for the quick fix!
