
Customize the floating-point precision of the base model (and the LoRA adapter) in webui chat #4410

Closed
1 task done
syGOAT opened this issue Jun 21, 2024 · 1 comment
Labels
solved This problem has been already solved

Comments

syGOAT commented Jun 21, 2024

Reminder

  • I have read the README and searched the existing issues.

System Info

  • llamafactory version: 0.8.2.dev0
  • Platform: Linux-5.15.0-78-generic-x86_64-with-glibc2.31
  • Python version: 3.11.9
  • PyTorch version: 2.3.0+cu121 (GPU)
  • Transformers version: 4.41.2
  • Datasets version: 2.19.1
  • Accelerate version: 0.30.1
  • PEFT version: 0.11.1
  • TRL version: 0.8.6
  • GPU type: NVIDIA GeForce RTX 4090

Reproduction

Start the webui, select the chat tab, enter the model path and the LoRA path, then load the model.
[screenshot: webui chat configuration]
In the backend log I see:
[screenshot: backend log showing the load dtype]
The base model is automatically loaded in bf16. Is it possible to customize its floating-point precision? My LoRA adapter is fp16, so loading the base model in fp16 as well might produce better outputs.
Also, is the LoRA adapter's floating-point precision detected automatically? If not, could it be made configurable too?
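For context, the behavior being asked about can be sketched as an explicit dtype flag resolved before loading. This is a hypothetical illustration, not LLaMA-Factory's actual code: the names `DTYPE_MAP` and `resolve_dtype` are invented here, and the commented transformers/PEFT calls show roughly how such a resolved dtype would be applied.

```python
# Hypothetical sketch of resolving a user-facing dtype flag instead of
# relying on automatic bf16 detection. DTYPE_MAP / resolve_dtype are
# illustrative names, not part of LLaMA-Factory.

DTYPE_MAP = {"fp16": "float16", "bf16": "bfloat16", "fp32": "float32"}

def resolve_dtype(name: str) -> str:
    """Map a dtype flag like "fp16" to the torch dtype attribute name."""
    if name not in DTYPE_MAP:
        raise ValueError(f"unsupported dtype flag: {name}")
    return DTYPE_MAP[name]

# With transformers and PEFT, the resolved dtype would be applied roughly
# like this (not executed here; requires model weights on disk):
#
#   import torch
#   from transformers import AutoModelForCausalLM
#   from peft import PeftModel
#
#   dtype = getattr(torch, resolve_dtype("fp16"))  # torch.float16
#   base = AutoModelForCausalLM.from_pretrained(base_path, torch_dtype=dtype)
#   model = PeftModel.from_pretrained(base, lora_path)
```

Passing `torch_dtype` explicitly to `from_pretrained` is the standard transformers mechanism for forcing a load precision, which is why matching it to the adapter's precision is a reasonable request.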

Expected behavior

Allow customizing the floating-point precision of the base model (and the LoRA adapter).

Others

No response

@github-actions github-actions bot added the pending This problem is yet to be addressed label Jun 21, 2024
Owner

hiyouga commented Jun 24, 2024

fixed

@hiyouga hiyouga added solved This problem has been already solved and removed pending This problem is yet to be addressed labels Jun 24, 2024
PrimaLuz pushed a commit to PrimaLuz/LLaMA-Factory that referenced this issue Jul 1, 2024
xtchen96 pushed a commit to xtchen96/LLaMA-Factory that referenced this issue Jul 17, 2024