
SwinUNETR for TensorRT #5893

Open
tangy5 opened this issue Jan 24, 2023 · 5 comments
Labels: enhancement (New feature or request)

Comments

tangy5 (Contributor) commented Jan 24, 2023

The current SwinUNETR is not scriptable with torch.jit.script; several of its features have limitations that prevent SwinUNETR from being compatible with TensorRT.

This issue is for discussing potential solutions for making SwinUNETR support TensorRT.

A potential solution is to create a 'light' version of the network, so that users are not significantly impacted by the transition.

  • The checkpoint option can be removed, since the inference pipeline is the main use case and does not require activation checkpointing.
  • Some packages, such as itertools and einops, need to be replaced (see the sketch after this list).
  • Functions that are not scriptable, such as Slice(), need to be rewritten.
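
For illustration, a minimal sketch of the kind of replacement meant above: an einops.rearrange call of the style used in swin_unetr.py swapped for an equivalent torch permute, which torch.jit.script handles natively. The exact call sites in SwinUNETR may differ.

```python
import torch
from einops import rearrange

x = torch.randn(2, 48, 8, 8, 8)  # (n, c, d, h, w)

# current style: relies on einops, which is not torch.jit.script friendly
y_einops = rearrange(x, "n c d h w -> n d h w c")

# scriptable replacement using only basic torch ops
y_torch = x.permute(0, 2, 3, 4, 1).contiguous()

assert torch.equal(y_einops, y_torch)
```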

Forward functions that contain non-basic Python can be modified to support torch.jit.script.
A light version of SwinUNETR can be a subclass of the current SwinUNETR network; when users load SwinUNETR from a bundle, the light version is used, which works with TensorRT.
The light version of SwinUNETR needs to be tested and evaluated to confirm comparable performance with the current version.
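
A rough sketch of the subclass idea (the class name and wiring are hypothetical, not an existing MONAI API):

```python
from monai.networks.nets import SwinUNETR


class SwinUNETRLight(SwinUNETR):
    """Hypothetical TensorRT-friendly variant: activation checkpointing is
    disabled, and the forward path would keep only script-compatible ops."""

    def __init__(self, *args, **kwargs):
        # checkpointing is only useful for training; inference does not need it
        kwargs["use_checkpoint"] = False
        super().__init__(*args, **kwargs)


# A bundle could instantiate SwinUNETRLight in place of SwinUNETR and then
# export it, e.g. with torch.jit.script, once the forward path is scriptable.
```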

Comments on other options are welcome; there may be a better choice.

Nic-Ma added the enhancement (New feature or request) label on Jan 25, 2023
tangy5 moved this to Done in MONAI v1.2 on Apr 5, 2023
csheaff commented Apr 5, 2023

Hi, glad to see this topic in discussion. My understanding from #5125 is that there is some work around using torch.jit.trace or torch.onnx.export. Is this not the case?

binliunls (Contributor) replied:

> Hi, glad to see this topic in discussion. My understanding from #5125 is that there is some work around using torch.jit.trace or torch.onnx.export. Is this not the case?

Hi @csheaff,
Currently, the swin_unetr model in the model-zoo is already supported via the torch.jit.trace conversion path. This ticket is about converting swin_unetr to a TensorRT engine-based TorchScript, which is not supported yet. We are working with the TensorRT team on it and hope to support it in a future version.

Thanks,
Bin
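
For context, a minimal sketch of the trace-based conversion mentioned above (the constructor arguments and input shape are illustrative; check them against your MONAI version):

```python
import torch
from monai.networks.nets import SwinUNETR

model = SwinUNETR(
    img_size=(96, 96, 96),  # deprecated/removed in newer MONAI versions
    in_channels=1,
    out_channels=14,
    feature_size=48,
    use_checkpoint=False,
).eval()

example_input = torch.randn(1, 1, 96, 96, 96)
with torch.no_grad():
    traced = torch.jit.trace(model, example_input)
traced.save("swin_unetr_traced.pt")
```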

csheaff commented Nov 8, 2024

Is this impacted by #7937?

KumoLiu (Contributor) commented Nov 11, 2024

Hi @binliunls, does the swin_unetr model now support TensorRT conversion using the new trt_compile API?
https://github.com/Project-MONAI/model-zoo/blob/dev/models/swin_unetr_btcv_segmentation/configs/inference_trt.json#L8

binliunls (Contributor) replied:

> Hi @binliunls, does the swin_unetr model now support TensorRT conversion using the new trt_compile API? https://github.com/Project-MONAI/model-zoo/blob/dev/models/swin_unetr_btcv_segmentation/configs/inference_trt.json#L8

Yes it does.
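
For readers looking for the current approach, a minimal usage sketch of trt_compile (the engine path, model arguments, and input shape are illustrative; check the exact signature against the MONAI version in use and the bundle's inference_trt.json):

```python
import torch
from monai.networks import trt_compile
from monai.networks.nets import SwinUNETR

# older MONAI versions also require img_size=(96, 96, 96)
model = SwinUNETR(in_channels=1, out_channels=14, feature_size=48).eval().cuda()
# load trained weights here, e.g. model.load_state_dict(torch.load("models/model.pt"))

# trt_compile wraps the model so that a TensorRT engine is built and cached at
# the given base path the first time forward is called
model = trt_compile(model, "models/swin_unetr")

with torch.no_grad():
    out = model(torch.randn(1, 1, 96, 96, 96, device="cuda"))
```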
