More Multiweight support cleanups #4948
Conversation
💊 Dr. CI summary as of commit 8aa1c23: 💚 Looks good so far! There are no failures yet.
Force pushes: 3bf6919 → 0eab8e7, 0eab8e7 → 1172e1f, fc7a23b → 9ff811b
Some notes for a potential refactoring proposed by @pmeier to use decorators instead of having the deprecation code within the model builders.
```diff
-if "pretrained" in kwargs:
-    warnings.warn("The parameter pretrained is deprecated, please use weights instead.")
-    weights = AlexNetWeights.ImageNet1K_RefV1 if kwargs.pop("pretrained") else None
+weights = _deprecated_param(kwargs, "pretrained", "weights", AlexNetWeights.ImageNet1K_RefV1)
```
@pmeier Case 1: Classification - lines 32-36
- The first argument might be positional (aka `alexnet(True)`), so we handle it for BC.
- We replace `pretrained=True` with the default value for the model.
We could add an additional parameter to the decorator, named `deprecated_param_idx`, and also check `*args` if it is passed.

On a side note, this is one of the reasons I aggressively prefer keyword-only arguments for every parameter that doesn't have an obvious meaning. `pretrained` is not such a case, since I would have no idea what `alexnet(True)` does without reading the documentation.

I understand we want to keep BC here, but maybe we can have a deprecation period and change to keyword-only afterwards? I can write a decorator for that if you like.
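Such a keyword-only deprecation decorator could be sketched as follows. Everything here is hypothetical (the decorator name, the warning text, and the `alexnet` stand-in that just reports what it received are not torchvision API):

```python
import functools
import warnings


def kwonly_deprecation(*param_names):
    """Hypothetical decorator: still accept positional values for the listed
    parameters, but warn and forward them as keyword arguments."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if args:
                warnings.warn(
                    f"Passing {param_names[:len(args)]} positionally is deprecated; "
                    "use keyword arguments instead.",
                    DeprecationWarning,
                )
                # Map positional values onto the named parameters.
                kwargs.update(zip(param_names, args))
            return fn(**kwargs)
        return wrapper
    return decorator


@kwonly_deprecation("pretrained")
def alexnet(pretrained=False, progress=True):
    # Stand-in model builder that just reports what it received.
    return {"pretrained": pretrained, "progress": progress}
```

After the deprecation period, the decorator could be dropped and the parameters made keyword-only in the signature itself (`def alexnet(*, pretrained=False, ...)`).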
Yes, agreed; everyone here is on board with preferring keyword arguments and deprecating the old usage. We need to coordinate this to avoid pushing too many changes concurrently, but I'm fully supportive of doing it.
```diff
-weights_backbone = ResNet50Weights.ImageNet1K_RefV1 if kwargs.pop("pretrained_backbone") else None
+weights_backbone = _deprecated_param(
+    kwargs, "pretrained_backbone", "weights_backbone", ResNet50Weights.ImageNet1K_RefV1
+)
```
@pmeier Case 2: Detection/Segmentation models - lines 83-94
- Receives two parameters: one for the whole network and one for the backbone.
- Both use similar logic as previously.
> Receives two parameters: one for the whole network and one for the backbone.

We can apply the decorator twice, no?
Yes, ideally you should be able to apply it twice.
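Applying the decorator twice could look like this sketch. All names here are hypothetical (`handle_deprecated`, the `fasterrcnn` stand-in, and the weight strings are illustrative, not torchvision API); the point is only that stacking the decorator handles each deprecated parameter independently:

```python
import functools
import warnings


def handle_deprecated(old_param, new_param, default_value):
    """Hypothetical decorator: translate a deprecated boolean kwarg into the
    new weights kwarg. Stack it once per deprecated parameter."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if old_param in kwargs:
                warnings.warn(f"'{old_param}' is deprecated, use '{new_param}' instead.")
                kwargs[new_param] = default_value if kwargs.pop(old_param) else None
            return fn(*args, **kwargs)
        return wrapper
    return decorator


@handle_deprecated("pretrained", "weights", "DETECTION_WEIGHTS")
@handle_deprecated("pretrained_backbone", "weights_backbone", "BACKBONE_WEIGHTS")
def fasterrcnn(weights=None, weights_backbone=None):
    # Stand-in detection builder that just reports the resolved weights.
    return weights, weights_backbone
```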
```python
default_value = KeypointRCNNResNet50FPNWeights.Coco_RefV1
if kwargs["pretrained"] == "legacy":
    default_value = KeypointRCNNResNet50FPNWeights.Coco_RefV1_Legacy
    kwargs["pretrained"] = True
```
@pmeier Special case of Case 2:
- The use of `"legacy"` appears only in this model.
- Everything else remains as-is, matching the other detection/segmentation models.
Given that we have a special case here anyway, I would prefer to write another small decorator that performs this conversion and is simply applied before the deprecation one.
Sure, any workaround would do. This happens only here, nowhere else.
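The small pre-processing decorator proposed above could be sketched like this. All names are hypothetical (the decorator, the `legacy` marker kwarg, and the `keypointrcnn` stand-in are illustrative only):

```python
import functools


def convert_legacy_pretrained(fn):
    """Hypothetical decorator: rewrite pretrained="legacy" into
    pretrained=True plus a legacy marker, so the generic deprecation
    decorator can run unchanged afterwards."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        if kwargs.get("pretrained") == "legacy":
            kwargs["pretrained"] = True
            kwargs["legacy"] = True
        return fn(*args, **kwargs)
    return wrapper


@convert_legacy_pretrained
def keypointrcnn(pretrained=False, legacy=False):
    # Stand-in builder that just reports what it received; in the real model
    # the legacy flag would pick Coco_RefV1_Legacy as the default weights.
    return pretrained, legacy
```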
```python
default_value = (
    QuantizedGoogLeNetWeights.ImageNet1K_FBGEMM_TFV1 if quantize else GoogLeNetWeights.ImageNet1K_TFV1
)
weights = _deprecated_param(kwargs, "pretrained", "weights", default_value)  # type: ignore[assignment]
```
@pmeier Case 3: Quantized models - lines 50-59
- The user can pass weights that are quantized or normal, hence the union here.
- The default value is conditional on the `quantize` parameter, but other than that everything remains the same.
- The `ignore[assignment]` is due to a limitation of mypy, which doesn't understand that `Optional[Union[XWeights, YWeights]]` is a subtype of `Optional[Weights]`.
I went over all the commits individually and it LGTM, thanks!
Summary:
- Updated the link for the densenet recipe.
- Set default value of `num_classes` and `num_keypoints` to `None`.
- Provide helper methods for parameter checks to reduce duplicate code.
- Throw errors on silent config overwrites from weight meta-data and legacy builders.
- Changing order of arguments + fixing mypy.
- Make the builders fully BC.
- Add "default" weights support that always returns the best weights.

Reviewed By: NicolasHug
Differential Revision: D32694303
fbshipit-source-id: c6b5f51209248fe6e5b3642fa4d08c388c5cb421
Partially fixes #4652.
Addresses some of the comments at #4937. Each commit in this PR addresses a single one of the comments below, so it can be seen as a stacked diff:
- Set default value of `num_classes` and `num_keypoints` to `None` - NOMRG Comments on prototype weights and model builders #4937 (comment), #4937 (comment)

cc @datumbox @bjuncek