
error when using XuanYuan-70B-int4-Chat #3365

Closed
1 task done
davidyao opened this issue Apr 21, 2024 · 1 comment
Labels
solved This problem has been already solved

Comments

@davidyao

Reminder

  • I have read the README and searched the existing issues.

Reproduction

python src/train_web.py

Chat mode with the model XuanYuan-70B-int4-Chat

engine: huggingface

An error is thrown while loading the model:

/.conda/envs/llama_factory/lib/python3.10/site-packages/transformers/utils/quantization_config.py", line 532, in post_init
raise ValueError("Cannot specify both disable_exllama and use_exllama. Please use just use_exllama")
ValueError: Cannot specify both disable_exllama and use_exllama. Please use just use_exllama
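The error comes from the GPTQ quantization-config validation in transformers, which rejects configs that carry both the deprecated `disable_exllama` flag and the newer `use_exllama` flag. A minimal sketch of that check and a workaround (the function name and the translation step are illustrative, not the library's actual code):

```python
# Sketch of the validation that raises the ValueError above, and a
# workaround: translate the legacy key before building the config.
# `validate_exllama_flags` is a hypothetical helper, not a transformers API.

def validate_exllama_flags(config: dict) -> dict:
    """Mimic the post_init check in transformers' quantization_config.py."""
    if "disable_exllama" in config and "use_exllama" in config:
        raise ValueError(
            "Cannot specify both disable_exllama and use_exllama. "
            "Please use just use_exllama"
        )
    # Translate the deprecated key to its modern equivalent.
    if "disable_exllama" in config:
        config["use_exllama"] = not config.pop("disable_exllama")
    return config

# A quantization config exported by an older GPTQ toolchain may still
# carry the legacy key; translating it avoids the conflict at load time.
legacy = {"bits": 4, "disable_exllama": True}
fixed = validate_exllama_flags(dict(legacy))
print(fixed)  # {'bits': 4, 'use_exllama': False}
```

In practice this means removing `disable_exllama` from the model's `quantization_config` (in `config.json`) or letting the loader pass only `use_exllama`, which is what the fix below addresses.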

Expected behavior

No response

System Info

No response

Others

No response

@hiyouga
Owner

hiyouga commented Apr 21, 2024

fixed

@hiyouga hiyouga added the solved This problem has been already solved label Apr 21, 2024