
How to run the fine-tuning using slurm? #16

Open
AlessioQuercia opened this issue Aug 30, 2024 · 1 comment
AlessioQuercia commented Aug 30, 2024

Could you provide a slurm script to run the fine-tuning code? There appear to be some issues with deepspeed when just following the provided instructions.

AlessioQuercia changed the title from "How to run the fine-tuning on slurm with deepspeed?" to "How to run the fine-tuning using slurm?" on Aug 30, 2024
kongds (Owner) commented Sep 1, 2024

Thank you for your interest in our work.

Our experiments were run directly with deepspeed over multiple nodes using the provided script (without slurm).
You may need some additional configuration to get slurm to run across multiple nodes. (Alternatively, you can run the script on a single node by changing --num_nodes=4 to --num_nodes=1.)
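For reference, a minimal sbatch sketch along these lines. The entry point `train.py`, the config `ds_config.json`, and the 8-GPUs-per-node assumption are placeholders for whatever the repo's fine-tuning script actually uses; it also assumes passwordless SSH between the allocated nodes, which the deepspeed launcher needs to reach remote hosts when given a hostfile:

```bash
#!/bin/bash
#SBATCH --job-name=finetune
#SBATCH --nodes=4
#SBATCH --ntasks-per-node=1
#SBATCH --gres=gpu:8
#SBATCH --time=24:00:00

# Build a deepspeed hostfile from the slurm allocation (8 slots per node assumed).
HOSTFILE=hostfile.$SLURM_JOB_ID
scontrol show hostnames "$SLURM_JOB_NODELIST" | awk '{print $1" slots=8"}' > "$HOSTFILE"

# Use the first allocated node as the rendezvous master.
MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)

# Launch across all allocated nodes; deepspeed ssh-es into the other hosts
# listed in the hostfile. --num_nodes matches the value in the provided script.
deepspeed --hostfile "$HOSTFILE" \
          --num_nodes=4 \
          --master_addr "$MASTER_ADDR" \
          --master_port 29500 \
          train.py --deepspeed ds_config.json
```

If the cluster does not allow SSH between compute nodes, an alternative is to launch one process per node with `srun` and pass the rank/master information through environment variables instead of a hostfile.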
