
fix Outlines version error #2562

Closed · 3 commits

Conversation

@shuaills (Contributor) commented Dec 23, 2024

Motivation

Upgrade Outlines to the latest version, following #2505 and #2550.

Modifications

  • Update Outlines dependency version and related adaptations

Checklist

  • Format your code according to the Contributor Guide.
  • Add unit tests as outlined in the Contributor Guide.
  • Update documentation as needed, including docstrings or example tutorials.

@shuaills shuaills requested a review from hnyls2002 as a code owner December 23, 2024 15:44
pyproject.toml excerpt:

"packaging", "pillow", "prometheus-client>=0.20.0",
"psutil", "pydantic", "python-multipart",
"pyzmq>=25.1.2", "torchao>=0.7.0", "gemlite", "uvicorn", "uvloop",
"xgrammar>=0.1.6"]
srt = ["sglang[runtime_common]", "torch", "vllm>=0.6.3.post1,<=0.6.4.post1", "cuda-python", "flashinfer==0.1.6"]
A Member commented:
I did not upgrade directly to v0.6.4.post1 and kept vllm 0.6.3.post1 because some major clients still need torch 2.4, which makes upgrading inconvenient, so I kept both versions compatible. Moreover, we only use the quantization part of vllm, and upgrading to vllm 0.6.5 provides no additional benefit for sglang.

A Member commented:

vllm 0.6.3.post1 uses torch 2.4
vllm 0.6.4.post1 uses torch 2.5
vllm 0.6.5 uses torch 2.5
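
The mapping above can be sketched as a small helper (a hedged sketch: the version pairs come directly from the comment above, while the function name `compatible_vllm_versions` and the dict layout are hypothetical, not part of sglang or vllm):

```python
# Map each vllm release to the torch major.minor series it requires,
# per the maintainer's comment above.
VLLM_TORCH_REQUIREMENTS = {
    "0.6.3.post1": "2.4",
    "0.6.4.post1": "2.5",
    "0.6.5": "2.5",
}

def compatible_vllm_versions(torch_version: str) -> list[str]:
    """Return the vllm releases whose required torch matches
    the major.minor of the given torch version string."""
    major_minor = ".".join(torch_version.split(".")[:2])
    return [v for v, req in VLLM_TORCH_REQUIREMENTS.items() if req == major_minor]

print(compatible_vllm_versions("2.4.1"))  # ['0.6.3.post1']
print(compatible_vllm_versions("2.5.0"))  # ['0.6.4.post1', '0.6.5']
```

This makes the constraint in the PR concrete: with sglang pinned to torch 2.4, only vllm 0.6.3.post1 fits, which is why the `vllm>=0.6.3.post1,<=0.6.4.post1` range was kept instead of upgrading to 0.6.5.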

@shuaills (Contributor, author) commented:

Thanks for the clarification! Sticking with vllm 0.6.3.post1 for torch 2.4 compatibility makes sense. I'll close the PR.

@shuaills (Contributor, author) commented Dec 23, 2024

After checking the dependencies, it seems better to stick with the current setup for now. sglang depends on torch 2.4, but newer vllm versions need torch 2.5, which makes things tricky. Closing the PR; details are tracked in the related issue.

@shuaills shuaills closed this Dec 23, 2024
@shuaills shuaills deleted the fix-outlines branch December 29, 2024 07:11