
Could you tell me which vLLM version you used? #6

Open
AnthonyX1an opened this issue Jan 2, 2025 · 3 comments

Comments

@AnthonyX1an

Because of a platform constraint I cannot install vLLM v0.6.4, so I plan to use an older vLLM release with your code to extract the last-layer embedding from the qwen2-1.5B-instruct model. Could you tell me which versions of vLLM, and of the related torch and transformers packages, you used?
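
For reference, a minimal sketch (not taken from this repo) of pulling pooled last-layer embeddings out of vLLM's embedding mode. It assumes a release where `LLM.encode()` returns embedding outputs; the `task` constructor argument only exists on newer releases and its accepted value differs across versions.

```python
# Minimal sketch, assuming a vLLM release with embedding-model support,
# where LLM.encode() returns EmbeddingRequestOutput objects.
from vllm import LLM

llm = LLM(
    model="Qwen/Qwen2-1.5B-Instruct",
    task="embedding",   # only accepted on newer vLLM versions; drop it on older ones
    enforce_eager=True,
)

outputs = llm.encode(["what is the capital of France?"])
for out in outputs:
    vec = out.outputs.embedding  # pooled last-hidden-state vector (list of floats)
    print(len(vec))
```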

@WuNein
Owner

WuNein commented Jan 2, 2025

I set this up about four months ago; please take a look. In principle, any version that officially supports embedding should work.

There is also the version from the early days, before embedding was supported, here:
https://github.com/WuNein/vllm4mteb/blob/main/old/old.md

@AnthonyX1an
Author

Would four months ago put it at roughly v0.5.0? I tried v0.6.0 today, and the error seems to be a package-version problem.

@WuNein
Owner

WuNein commented Jan 2, 2025

Just give it a try. My guess is that on a newer version you only need to build
https://github.com/vllm-project/flash-attention
manually.
