[Bug]: ValueError: The checkpoint you are trying to load has model type cohere2 but Transformers does not recognize this architecture.
#11299
Labels: bug
Your current environment
The output of `python collect_env.py`
Model Input Dumps
No response
🐛 Describe the bug
I'm unable to run the model CohereForAI/c4ai-command-r7b-12-2024 with the latest container version of vLLM (vllm/vllm-openai:v0.6.5). Startup fails with a ValueError:

> The checkpoint you are trying to load has model type `cohere2` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

In #11203 a newer Transformers version (>= 4.48.0) is mentioned, but the container ships version 4.47.1 (see the environment report).
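For reference, here is a minimal standalone sketch (outside vLLM) that should reproduce the same error with the container's transformers 4.47.1; it only probes the installed version and tries to load the model config, and it assumes you already have access to the gated Hugging Face repo:

```python
# Minimal reproduction, independent of vLLM: loading the config for
# CohereForAI/c4ai-command-r7b-12-2024 with transformers < 4.48.0 should
# raise the same "model type `cohere2`" ValueError.
import transformers
from transformers import AutoConfig

print("transformers version:", transformers.__version__)  # 4.47.1 in the v0.6.5 image

try:
    AutoConfig.from_pretrained("CohereForAI/c4ai-command-r7b-12-2024")
    print("cohere2 config loaded; transformers is new enough")
except ValueError as exc:
    # On 4.47.1 this prints the "does not recognize this architecture" message.
    print("Failed to load config:", exc)
```

As a possible (untested) workaround, upgrading Transformers inside the container, e.g. `pip install "transformers>=4.48.0"`, should at least clear the unrecognized-architecture error, assuming vLLM 0.6.5 otherwise supports this architecture.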