
Reapply torch.compile in Fabric.setup() #19280

Merged: 59 commits into master from feature/rewrap-optimized-module2 on Jan 24, 2024

Conversation

awaelchli (Contributor) commented on Jan 13, 2024

What does this PR do?

Alternative implementation to #19192


Part of #17250

When you do

model = torch.compile(orig_model)
model = fabric.setup(model)

today, you end up with

FabricModule(StrategyWrapper(OptimizedModule(orig_model)))

where StrategyWrapper represents the wrapper applied by the strategy, e.g., FSDP, DDP, DeepSpeedEngine, etc.
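
For reference, torch.compile itself wraps the module in an OptimizedModule that keeps a reference to the original module. A minimal, self-contained sketch of that wrapping (plain PyTorch, independent of Fabric; note that _orig_mod is a private attribute):

import torch
import torch.nn as nn

# torch.compile returns an OptimizedModule that holds the original module on `_orig_mod`
orig_model = nn.Linear(4, 4)
compiled = torch.compile(orig_model)

assert isinstance(compiled, torch._dynamo.OptimizedModule)
assert compiled._orig_mod is orig_model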

With the new feature enabled in this PR, you will now get:

FabricModule(OptimizedModule(StrategyWrapper(orig_model)))

Note how the optimized module is now applied outside the strategy wrapper, but inside the FabricModule wrapper.
It is important to keep the FabricModule outside the OptimizedModule, because it contains only convenience routing and precision management that does not need to be compiled (nor can it be, without graph breaks at the moment). At the same time, the FSDP/DDP-specific operations can still be captured properly by torch.compile, which is how the wrapping is intended to be done in PyTorch.
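
Conceptually, the re-application could look roughly like the sketch below. This is not Fabric's actual code path; apply_strategy_wrapper is a hypothetical stand-in for the strategy's own wrapping, and re-using the user's original torch.compile arguments is omitted for brevity:

import torch
from torch._dynamo import OptimizedModule

def setup_with_reapplied_compile(module, apply_strategy_wrapper):
    # Hypothetical helper illustrating the re-wrap order described above.
    was_compiled = isinstance(module, OptimizedModule)
    if was_compiled:
        # Peel off the OptimizedModule so the strategy wraps the original module directly.
        module = module._orig_mod
    wrapped = apply_strategy_wrapper(module)  # e.g. DDP / FSDP / DeepSpeedEngine
    if was_compiled:
        # Re-apply torch.compile outside the strategy wrapper,
        # yielding OptimizedModule(StrategyWrapper(orig_model)).
        wrapped = torch.compile(wrapped)
    return wrapped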

This feature is marked as experimental because we first need to test its application in the wild.

Follow-up work:

  • Verify it in the wild
  • Decide on the default
  • Document it in the compile-guide
  • Do the same in Lightning Trainer

📚 Documentation preview 📚: https://pytorch-lightning--19280.org.readthedocs.build/en/19280/

cc @Borda @carmocca @justusschock @awaelchli

awaelchli and others added 30 commits December 21, 2023 00:31
@github-actions github-actions bot added the docs Documentation related label Jan 14, 2024
@mergify mergify bot removed the has conflicts label Jan 15, 2024
Review threads on src/lightning/fabric/wrappers.py (resolved).
@github-actions github-actions bot added the pl Generic label for PyTorch Lightning package label Jan 23, 2024
@mergify mergify bot added the ready PRs ready to be merged label Jan 23, 2024
@awaelchli awaelchli merged commit 7cc79fe into master Jan 24, 2024
120 checks passed
@awaelchli awaelchli deleted the feature/rewrap-optimized-module2 branch January 24, 2024 02:17
Labels
  • docs: Documentation related
  • fabric: lightning.fabric.Fabric
  • feature: Is an improvement or enhancement
  • fun: Staff contributions outside working hours, to differentiate from the "community" label
  • pl: Generic label for PyTorch Lightning package
  • ready: PRs ready to be merged
  • torch.compile