Customize the floating-point representation of the base model (and the LoRA adapter)
fca893d
fixed
fix hiyouga#4410
7e9f2b3
f5dcb3a
Reminder
System Info
llamafactory version: 0.8.2.dev0

Reproduction
Launch the webui, select Chat, enter the model path and the LoRA path, and load the model.
In the backend log I can see that the base model is automatically loaded in bf16.
Is it possible to customize its floating-point representation? My LoRA adapter is fp16, so loading the base model in fp16 as well might give better output quality.
Also, is the floating-point representation of the LoRA adapter determined automatically? If not, could it be made configurable as well?
Expected behavior
Allow customizing the floating-point representation of the base model (and the LoRA adapter).
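For reference, outside the webui this can be approximated by forcing fp16 explicitly with transformers and peft. This is a minimal sketch, not LLaMA-Factory's own API; the paths below are placeholders.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_path = "path/to/base_model"       # placeholder
adapter_path = "path/to/lora_adapter"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(base_path)

# Force fp16 instead of letting the framework pick bf16 automatically.
model = AutoModelForCausalLM.from_pretrained(
    base_path,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Attach the LoRA adapter, then cast everything (base + adapter) to fp16
# so both parts end up in the same precision.
model = PeftModel.from_pretrained(model, adapter_path)
model = model.half().eval()
```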
Others
No response