
[BUG] Fail to import is_torch_greater_or_equal_than_1_13 since transformers v4.48.0 for all deepseek models #290

Open
yuxianq opened this issue Jan 16, 2025 · 0 comments

yuxianq commented Jan 16, 2025

Describe the bug
Importing is_torch_greater_or_equal_than_1_13 fails with transformers v4.48.0 for all deepseek models.

To Reproduce
Install transformers v4.48.0 and run any deepseek model.

Expected behavior
Deepseek models should run with transformers v4.48.0.

Additional context
is_torch_greater_or_equal_than_1_13 was removed in transformers v4.48.0, so every usage of it needs to be removed from all deepseek models, not only this one. Can the maintainers help to fix all the models? It is important for us to provide deepseek model support in TensorRT-LLM. Thanks~
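One possible workaround until the modeling files are fixed is to guard the import and fall back to a local version check. This is only a sketch; `torch_is_at_least` is a hypothetical helper, not part of transformers, and the exact comparison the removed helper performed may differ in detail.

```python
import re

def torch_is_at_least(torch_version: str, minimum: str = "1.13") -> bool:
    """Illustrative replacement for the removed helper.

    Strips local/dev suffixes such as "+cu118" or "a0" and compares
    the numeric release tuples, e.g. "2.1.0+cu118" >= "1.13".
    """
    def release(v: str) -> tuple:
        parts = v.split("+")[0].split(".")
        # Keep only the leading digits of each component ("0a0" -> 0).
        return tuple(int(re.match(r"\d+", p).group())
                     for p in parts if re.match(r"\d+", p))
    return release(torch_version) >= release(minimum)

# Intended use inside a deepseek modeling file (assumes torch is installed):
# try:
#     from transformers.pytorch_utils import is_torch_greater_or_equal_than_1_13
# except ImportError:  # transformers >= 4.48.0 removed the helper
#     import torch
#     is_torch_greater_or_equal_than_1_13 = torch_is_at_least(torch.__version__)
```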

1 participant