With PTV2, setting the optimizer to adam raises AttributeError: module 'torch.optim' has no attribute 'adam'; adamw works fine. LoRA does not have this problem.
Config is as follows:
enable_deepspeed = True
enable_ptv2 = True
enable_lora = False
enable_int8 = False  # qlora int8
enable_int4 = False  # qlora int4
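For reference, here is a minimal sketch of a case-insensitive optimizer lookup that avoids this class of error. This is not deep_training's actual code, and `resolve_optimizer` is a hypothetical helper: as the traceback shows, `torch.optim` has no lowercase `adam` attribute (the classes are `Adam` and `AdamW`), so resolving the configured name against the class names case-insensitively sidesteps the AttributeError.

```python
# Hypothetical sketch, not deep_training's implementation: map an optimizer
# name string (e.g. "adam") to the matching torch.optim class, ignoring case.
import inspect

import torch
import torch.optim as optim


def resolve_optimizer(name: str):
    """Return the torch.optim class whose name matches `name`, case-insensitively."""
    classes = {
        cls_name.lower(): cls
        for cls_name, cls in vars(optim).items()
        if inspect.isclass(cls) and issubclass(cls, optim.Optimizer)
    }
    try:
        return classes[name.lower()]
    except KeyError:
        raise ValueError(f"Unknown optimizer '{name}'. Available: {sorted(classes)}")


# Usage: build the optimizer from the config string.
model = torch.nn.Linear(4, 2)
opt_cls = resolve_optimizer("adam")          # -> torch.optim.Adam
optimizer = opt_cls(model.parameters(), lr=1e-4)
```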
pip list | grep torch
pytorch-lightning    2.0.4
torch                2.0.1
torchaudio           2.0.2
torchmetrics         1.0.0
torchvision          0.15.2
pip uninstall deep_training
pip install -U git+https://github.com/ssbuild/deep_training.git
It was a bug and has been fixed; upgrading should solve it.
Thanks a lot. I'll try it.