[pipelining] Follow improvements in export.unflatten #126217
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/126217

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures as of commit c75e69b with merge base 3759676.

This comment was automatically generated by Dr. CI and updates every 15 minutes.
ghstack-source-id: ae40df3b4575acbf2f35039db6891ed1478d0619 Pull Request resolved: #126217
looks good!
```python
assert node_module_stack == self.module_stack
self.copy_node(node)
node_idx += 1

from torch.export.unflatten import _ModuleFrame


def _outline_submodules(orig_graph: torch.fx.Graph):
```
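The diff above replaces pipelining's vendored helper machinery with an import from `torch.export.unflatten`, keeping only `_outline_submodules` locally. As a rough illustration of that pattern (the module and function names below are placeholders, not the actual PyTorch API), delegating to an upstream helper rather than duplicating it can look like:

```python
# Illustrative sketch only: "upstream" stands in for a shared library
# module (e.g. torch.export.unflatten), not the real API.
import types

# Pretend this is the upstream library whose bug fixes and new
# features we want to pick up automatically.
upstream = types.SimpleNamespace(
    unflatten=lambda pairs: dict(pairs),  # stand-in implementation
)


def outline_submodules(pairs):
    """The one locally kept entry point; the heavy lifting is
    delegated to the upstream implementation instead of a copy."""
    return upstream.unflatten(pairs)


print(outline_submodules([("a", 1), ("b", 2)]))  # {'a': 1, 'b': 2}
```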
Any value of this just getting moved to torch.export.unflatten as well? Or is this pretty PP specific?
yeah, this is the one func that's special.
@pytorchbot merge

Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Previously, we made a copy of `torch.export.unflatten` in pippy/_unflatten.py. But it turned out to be too hard to track bug fixes and improvements in the upstream version. For example, `torch.export.unflatten` recently added support for tied parameters, which is something pipelining needs. Now that we have moved into PyTorch, we reference `torch.export.unflatten` instead of maintaining a copy.

Pull Request resolved: pytorch#126217
Approved by: https://github.com/H-Huang
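For context, a minimal sketch of what "unflattening" means conceptually: rebuilding a nested hierarchy from flat, dotted names. This is only an illustration of the idea, not the actual `torch.export.unflatten` implementation, which reconstructs an `nn.Module` tree from an exported flat graph.

```python
# Conceptual sketch: turn a flat mapping with dotted keys back into a
# nested dict, similar in spirit to rebuilding a module hierarchy from
# a flat exported graph. Not the real torch code.

def unflatten(flat: dict) -> dict:
    nested: dict = {}
    for dotted, value in flat.items():
        *parents, leaf = dotted.split(".")
        node = nested
        for part in parents:
            node = node.setdefault(part, {})  # descend, creating levels
        node[leaf] = value
    return nested


params = {"layer1.weight": 1, "layer1.bias": 2, "layer2.weight": 3}
print(unflatten(params))
# {'layer1': {'weight': 1, 'bias': 2}, 'layer2': {'weight': 3}}
```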
Stack from ghstack (oldest at bottom):
cc @mrshenli @pritamdamania87 @zhaojuanmao @satgera @gqchen @aazzolini @osalpekar @jiayisuse @H-Huang @awgu @penguinwu @fegin @XilunWu @wanchaol @fduwjj @wz337 @tianyu-l @wconstab @yf225 @chauhang @d4l3k