Uncouple multiprocessing settings #240

Open
ejm714 opened this issue Sep 28, 2022 · 1 comment
Labels
enhancement (New feature or request), good first issue (Good for newcomers)

Comments

@ejm714
Collaborator

ejm714 commented Sep 28, 2022

Right now, we set the multiprocessing_context for the Trainer based on the num_workers used for the data loader:

https://github.com/drivendataorg/zamba/blob/master/zamba/pytorch_lightning/utils.py#L67-L71

https://github.com/drivendataorg/zamba/blob/master/zamba/models/model_manager.py#L283-L286
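
For context, the coupling looks roughly like the sketch below. This is illustrative only, not the exact code at the links above: a single num_workers value ends up driving both the DataLoader arguments and the Trainer's multiprocessing/strategy settings.

```python
# Illustrative sketch of the current coupling -- not the exact zamba code linked above.
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset

num_workers = 4
dataset = TensorDataset(torch.zeros(8, 3))  # placeholder dataset for illustration

# num_workers drives the DataLoader settings...
dataloader = DataLoader(
    dataset,
    num_workers=num_workers,
    persistent_workers=num_workers > 0,
    multiprocessing_context="spawn" if num_workers > 0 else None,
)

# ...and the same value also implicitly selects a distributed strategy for the Trainer.
trainer_kwargs = {"devices": 1}
if num_workers > 0:
    trainer_kwargs["strategy"] = "ddp"  # this is the coupling the issue wants to remove
trainer = pl.Trainer(**trainer_kwargs)
```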

It would be good to separate these out, for a few reasons:

  • it lets us use multiple cores for data loading without needing to set a multiprocessing strategy for the trainer when running on only a single GPU
  • we've only trained models on a single GPU, so it's not clear that multiprocessing for the model is fully and properly configured
  • PyTorch Lightning is currently making a lot of changes to its accelerators and strategies for distributed training, so it would be nice to let those settle a bit before supporting multi-GPU training in zamba

Implementation thoughts:

  • do not infer the multiprocessing context from num_workers (only use num_workers for the dataloaders and to determine persistent_workers)
  • consider adding a multiprocessing strategy on the train config object, defaulting to the PTL default. Another option is to make this a boolean and let zamba determine the best strategy/accelerator combo (see the sketch after this list)
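
A minimal sketch of what the decoupled version could look like, assuming a pydantic-style train config. The field name multiprocessing_strategy and its default are hypothetical, not existing zamba API:

```python
# Hedged sketch of the proposed decoupling -- field names here are assumptions, not zamba API.
from typing import Optional

import torch
import pytorch_lightning as pl
from pydantic import BaseModel
from torch.utils.data import DataLoader, TensorDataset


class TrainConfig(BaseModel):
    # num_workers only affects the DataLoader
    num_workers: int = 3
    # strategy is configured independently; None falls back to the PTL default
    multiprocessing_strategy: Optional[str] = None


config = TrainConfig()
dataset = TensorDataset(torch.zeros(8, 3))  # placeholder dataset for illustration

dataloader = DataLoader(
    dataset,
    num_workers=config.num_workers,
    persistent_workers=config.num_workers > 0,  # still derived from num_workers
)

trainer_kwargs = {"devices": 1}
if config.multiprocessing_strategy is not None:
    # only set a strategy when explicitly requested; otherwise let PTL pick its default
    trainer_kwargs["strategy"] = config.multiprocessing_strategy
trainer = pl.Trainer(**trainer_kwargs)
```

With this shape, num_workers stays a pure data-loading knob, and anything distributed is opt-in via the config field (or, if the boolean route is taken, inferred once in a single place).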
@ejm714 ejm714 added the enhancement New feature or request label Sep 28, 2022
@klwetstone klwetstone added the good first issue Good for newcomers label Apr 17, 2024
@aaronphilip19

Hey, @sambujangfofana and I are students at the University of Michigan. We are currently working on a project in which we have to contribute to a GitHub repository (https://eecs481.org/hw6.html). We are very interested in this issue and would like to work on it. We hope to submit a pull request this week. Could we be assigned this issue?
