Add ROCm support #4970
Conversation
LGTM!
Hi, I'm trying this Docker Compose setup on an AMD Instinct MI100. When I set
When installing vLLM, it automatically installs NVIDIA-related packages, which don't work:
What should I do?
I have not tried installing vLLM. I think it will require a driver update to run. I will try to create a basic image with the new drivers within a month. The image currently runs on ROCm 6.1, but there is a newer version, 6.2.
OK, thanks for the reply.
@fyr233 The ROCm drivers have been updated to version 6.2, and the vLLM installation error is now fixed. Tested on an AMD Radeon RX 7900 XTX. You need to change the version of the base image.
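A minimal sketch of the kind of base-image change meant here, assuming the build starts from an official `rocm/dev` base image on Docker Hub (the exact image name and tag used in this repository may differ):

```dockerfile
# Hypothetical example: bump the ROCm base image from 6.1 to 6.2.
# The actual image name and tag in this PR's Dockerfile may be different.
FROM rocm/dev-ubuntu-22.04:6.2
```

After changing the tag, rebuild the image (e.g. `docker compose build`) so the new ROCm userspace libraries are picked up.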
Added a Docker build demonstrating how to run the project on AMD GPUs using ROCm.
Tested on an AMD Radeon RX 7900 XTX.