Conversation
option go_package = "github.com/lyft/flyteidl/gen/pb-go/flyteidl/plugins";

message PyTorchOperatorTask {
Can you add some code comments here...
Sure, done.
I think we should call the plugin type DistributedPytorchTraining?
Also, the comment should then say that this plugin enables distributed training for PyTorch using "link to the Kubeflow operator". What do you think, @igorvalko?
done
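For illustration, a minimal sketch of how the renamed and commented message could look after this review. The go_package option and the suggested name come from this thread; the workers field, its number, and the exact comment wording are assumptions, not the final proto:

    syntax = "proto3";

    package flyteidl.plugins;

    option go_package = "github.com/lyft/flyteidl/gen/pb-go/flyteidl/plugins";

    // Sketch based on the review discussion above; field layout is assumed.
    // Custom proto for a plugin that enables distributed training for PyTorch
    // using the Kubeflow PyTorch operator.
    message DistributedPyTorchTrainingTask {
        // Number of worker replicas spawned in the cluster for this job (assumed field).
        int32 workers = 1;
    }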
TL;DR
Proto files and the generated code for the PyTorch Flyte plugin.
Type
Are all requirements met?
Complete description
This PR is part of the PyTorch Flyte plugin implementation.
Related PRs:
flyteorg/flytekit#112
flyteorg/flyteplugins#85
Tracking Issue
N/A
Follow-up issue
N/A