Reminder
I have read the README and searched the existing issues.
Reproduction
python src/train_web.py
Chat mode with model XuanYuan-70B-int4-Chat, engine: huggingface.
An error is thrown when loading the model:
/.conda/envs/llama_factory/lib/python3.10/site-packages/transformers/utils/quantization_config.py", line 532, in post_init
raise ValueError("Cannot specify both disable_exllama and use_exllama. Please use just use_exllama")
ValueError: Cannot specify both disable_exllama and use_exllama. Please use just use_exllama
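The exception comes from the validation in `GPTQConfig.post_init` (in `transformers/utils/quantization_config.py`), which rejects a quantization config that sets both the deprecated `disable_exllama` flag and its replacement `use_exllama`. A minimal sketch of that check, with the branching reconstructed from the traceback message (the exact function body in transformers may differ):

```python
def check_exllama_flags(disable_exllama=None, use_exllama=None):
    """Sketch of the conflicting-flag check that raises in the traceback above.

    Returns whether exllama kernels end up enabled; raises ValueError when
    both the legacy and the new flag are given, as in this issue.
    """
    if disable_exllama is None and use_exllama is None:
        # Neither flag given: exllama kernels default to enabled.
        return True
    if disable_exllama is not None and use_exllama is not None:
        # Both flags given: this is the error path hit here.
        raise ValueError(
            "Cannot specify both disable_exllama and use_exllama. "
            "Please use just use_exllama"
        )
    if disable_exllama is not None:
        # Legacy flag only: honored but deprecated in recent transformers.
        return not disable_exllama
    return use_exllama
```

This suggests the loaded model's `quantization_config` (or a default injected by the loader) carries both keys; dropping `disable_exllama` and keeping only `use_exllama` in the config should avoid the error.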
Expected behavior
No response
System Info
No response
Others
No response