AttributeError: 'str' object has no attribute 'template'. Did you mean: 'replace'? #5366
Status: Closed
Labels: solved (this problem has already been solved)

Comments
Oops, forgot the stacktrace.
I think this is a bug. I worked around it by adding the line `data_args = DataArguments(template=data_args)` in the `get_template_and_fix_tokenizer` function.
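For context, here is a minimal, self-contained sketch of the coercion the comment above describes, assuming the referenced commit changed `get_template_and_fix_tokenizer` to expect a `DataArguments` object while the export path still passes the template name as a plain string. The `DataArguments` dataclass below is a simplified stand-in for `llamafactory.hparams.DataArguments`, not the real class, and this is the reporter's workaround rather than the upstream fix:

```python
# Sketch of the workaround described in the comment above.
# DataArguments here is a simplified stand-in for llamafactory.hparams.DataArguments.
from dataclasses import dataclass
from typing import Optional, Union


@dataclass
class DataArguments:
    template: Optional[str] = None


def get_template_and_fix_tokenizer(tokenizer, data_args: Union[str, DataArguments]):
    # Callers on the export path may still pass the template name as a plain
    # string; wrap it so the rest of the function can read data_args.template.
    if isinstance(data_args, str):
        data_args = DataArguments(template=data_args)
    return data_args.template  # no longer raises AttributeError on a str


print(get_template_and_fix_tokenizer(tokenizer=None, data_args="llama3"))  # -> llama3
```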
hiyouga added the solved label and removed the pending label on Sep 5, 2024.
fixed
yuwangnexusera pushed a commit to yuwangnexusera/LLaMA-Factory that referenced this issue on Sep 6, 2024.
System Info
Latest branch. The error is introduced by this commit:
dabad55#diff-bddef399f9575dd690ff2e4a91ae1bcfba5614eaaac63f139fe1ed2d988ad90cR348
Reproduction
```yaml
### model
model_name_or_path: meta-llama/Meta-Llama-3.1-8B-Instruct/
adapter_name_or_path: saves/distill/sft
template: llama3
finetuning_type: lora

### export
export_dir: models/distill
export_size: 2
export_device: cuda
export_legacy_format: false
```
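Presumably this config is saved to a YAML file and passed to the export entry point (e.g. `llamafactory-cli export <config>.yaml`), and the `AttributeError` in the title is raised during that export step when the template is resolved.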
Expected behavior
It should correctly export the model using the template.
Others
No response