Because of platform constraints I can't install vllm v0.6.4, so I plan to use an older vllm release with your code to extract the last-layer embedding of the qwen2-1.5B-instruct model. Could you tell me which versions of vllm and the related torch and transformers packages you used?
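For reference, a minimal sketch of what this question is trying to do, assuming a vllm release that already supports the embedding task (the `task="embedding"` argument and `llm.encode` API come from the v0.6.x line and may differ or be absent in older versions; this is not the repo's own code):

```python
# Minimal sketch: extract pooled last-layer embeddings with vllm.
# Assumes a vllm release with embedding-task support (roughly >= 0.6.3);
# older releases need the patched code from this repo instead.
from vllm import LLM

llm = LLM(
    model="Qwen/Qwen2-1.5B-Instruct",
    task="embedding",    # load the model for embedding instead of generation
    enforce_eager=True,  # skip CUDA-graph capture; safer across versions
)

outputs = llm.encode(["What is the capital of France?"])
embedding = outputs[0].outputs.embedding  # list[float] of hidden-size length
print(len(embedding))
```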
I set this up about four months ago; take a look. In principle, any version that officially supports embedding should work.
There's also a version from the early days, before vllm was compatible with embedding; see https://github.com/WuNein/vllm4mteb/blob/main/old/old.md
Was the version from four months ago roughly v0.5.0? I tried 0.6.0 today and the error looks like a package-version problem.
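When debugging a mismatch like this, it helps to record the exact versions in the environment first (a generic diagnostic snippet, not something from this repo):

```python
# Quick diagnostic: print the exact package versions in the current
# environment before comparing against a known-working setup.
import torch
import transformers
import vllm

print("vllm        :", vllm.__version__)
print("torch       :", torch.__version__)
print("transformers:", transformers.__version__)
print("CUDA (torch):", torch.version.cuda)
```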
Just give it a try. My guess is that on the newer versions you'll need to compile https://github.com/vllm-project/flash-attention manually.
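A quick way to check whether that flash-attention build is actually visible to Python (the module name `vllm_flash_attn` is an assumption based on the vllm-project/flash-attention repo; adjust it if your build produces a different name):

```python
# Check whether vllm's flash-attention extension is importable.
# The module name `vllm_flash_attn` is an assumption; verify it against
# the wheel your build of vllm-project/flash-attention actually produces.
import importlib.util

if importlib.util.find_spec("vllm_flash_attn") is None:
    print("vllm_flash_attn not found; vllm will fall back to another "
          "attention backend or fail, depending on the version")
else:
    import vllm_flash_attn
    print("vllm_flash_attn available:", vllm_flash_attn.__file__)
```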