Make ViT and UNETR torchscript compatible #7937
Conversation
Signed-off-by: YunLiu <[email protected]>
for more information, see https://pre-commit.ci
/build
Hi @ericspod and @virginiafdez, the issue was introduced by merging the gen-ai-dev branch: #7886. ViT and UNETR no longer support TorchScript conversion after merging that PR.
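A minimal sketch of how such a regression is typically detected (a hypothetical toy block stands in for the MONAI networks): attempting torch.jit.script on the module either succeeds or raises with the incompatible construct.

```python
import torch
import torch.nn as nn


class TinyBlock(nn.Module):
    """Hypothetical stand-in for a network such as ViT or UNETR."""

    def __init__(self, dim: int = 8):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.proj(x))


# torch.jit.script raises if the module is not TorchScript compatible
scripted = torch.jit.script(TinyBlock())
out = scripted(torch.zeros(2, 8))
print(tuple(out.shape))  # (2, 8)
```

Running the same check against the real networks after a merge is a quick way to catch TorchScript regressions early.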
I have checked this with Eric. Our only concern is in selfattention.py: when you add the CrossAttention block as an attribute by default, it might lead to incompatibilities when loading module weights from old models, as those weights are expected to be present unless you pass strict=False.
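A minimal sketch of that loading concern, with hypothetical module names standing in for the real blocks: a checkpoint saved before the new submodule existed fails a strict load but succeeds with strict=False, which reports the missing keys instead of raising.

```python
import torch.nn as nn


class OldSABlock(nn.Module):
    """Hypothetical: the block as saved in old checkpoints."""

    def __init__(self):
        super().__init__()
        self.attn = nn.Linear(8, 8)


class NewSABlock(nn.Module):
    """Hypothetical: the same block with a new default submodule."""

    def __init__(self):
        super().__init__()
        self.attn = nn.Linear(8, 8)
        self.cross_attn = nn.Linear(8, 8)  # added by default, absent from old checkpoints


old_state = OldSABlock().state_dict()
model = NewSABlock()

# strict=True (the default) raises because cross_attn weights are missing
try:
    model.load_state_dict(old_state)
    raise AssertionError("expected a RuntimeError")
except RuntimeError:
    pass

# strict=False skips the missing entries and reports them instead
result = model.load_state_dict(old_state, strict=False)
print(result.missing_keys)  # ['cross_attn.weight', 'cross_attn.bias']
```

The shared attn weights still load correctly in the non-strict case; only the newly added parameters keep their freshly initialized values.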
/build
Let's put this through now and we'll come back to the issue of weights later; if we're lucky, no one's saved models are broken by this anyway.
Yes, it will potentially be an issue; we may need to add a
Fixes #7936
Description
Add self.causal_mask = torch.Tensor() before the register buffer.

Types of changes
- ./runtests.sh -f -u --net --coverage
- ./runtests.sh --quick --unittests --disttests
- make html command in the docs/ folder
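A minimal sketch of the pattern behind the fix (a hypothetical toy module, not the actual MONAI code): the causal_mask attribute is assigned a plain torch.Tensor() whenever the buffer is not registered, so the attribute always exists with a Tensor type and torch.jit.script can compile forward() for both configurations.

```python
import torch
import torch.nn as nn


class CausalToy(nn.Module):
    """Hypothetical toy illustrating the causal_mask pattern."""

    def __init__(self, causal: bool = False, seq_len: int = 4):
        super().__init__()
        self.causal = causal
        if causal:
            self.register_buffer("causal_mask", torch.tril(torch.ones(seq_len, seq_len)))
        else:
            # without this assignment the attribute would not exist on
            # non-causal instances and torch.jit.script would fail to
            # compile the reference to self.causal_mask in forward()
            self.causal_mask = torch.Tensor()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.causal:
            x = x * self.causal_mask[: x.size(-2), : x.size(-1)]
        return x


# both configurations now script cleanly
plain = torch.jit.script(CausalToy(causal=False))
masked = torch.jit.script(CausalToy(causal=True))
y = masked(torch.ones(1, 4, 4))
```

TorchScript requires every attribute read in forward() to exist with a stable type on the scripted instance, which is why the non-buffer branch assigns an empty tensor instead of leaving the attribute undefined.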