
ONNX + TensorRT support for XLM-R Large ViT-B/16+? #822

Closed
NOT-HAL9000 opened this issue Sep 15, 2022 · 5 comments · Fixed by #828

Comments

@NOT-HAL9000

Just wondering if this is possible? Thanks!

@ZiniuYu
Member

ZiniuYu commented Sep 15, 2022

Hi, thank you for asking!

For ONNX it is possible, and it will be supported in a future release. Unfortunately, we do not plan to add TensorRT support for the M-CLIP models in the near future; in the meantime, you may refer to other online resources on converting PyTorch models to TensorRT.
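
For reference, the usual conversion path is PyTorch → ONNX → TensorRT. A minimal sketch of that path is shown below; it uses a placeholder module and shapes rather than the actual M-CLIP text/vision towers, so everything here is illustrative only:

```python
import torch

# Placeholder module standing in for the model you want to convert;
# the real M-CLIP text/vision towers would be loaded here instead.
model = torch.nn.Linear(512, 512).eval()
dummy_input = torch.randn(1, 512)

# Step 1: export the PyTorch model to ONNX.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
    opset_version=14,
)

# Step 2: with TensorRT installed, build an engine from the ONNX graph, e.g.
#   trtexec --onnx=model.onnx --saveEngine=model.plan
```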

@NOT-HAL9000
Author

> For ONNX it is possible, and it will be supported in a future release

Thanks! Is there a roadmap with a rough ETA for the ONNX runtime for M-CLIP models?

@numb3r3
Member

numb3r3 commented Sep 19, 2022

It's hard to give an exact timeline for ONNX M-CLIP support, but we hope to release it by the end of this month.

ZiniuYu linked a pull request on Sep 23, 2022 that will close this issue
@numb3r3
Member

numb3r3 commented Sep 27, 2022

@NOT-HAL9000 I'm glad to say that ONNX M-CLIP models are now enabled in the latest version. Please play around with it and leave your comments.
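
For example, once a clip-server instance is running with the multilingual ONNX model, the client side could look roughly like this (a sketch; the server address is an assumption, and the model itself is selected in the server-side configuration, not here):

```python
from clip_client import Client

# Connect to a running clip-server instance (address is an assumption).
c = Client('grpc://0.0.0.0:51000')

# Encode multilingual sentences; the result is a numpy array of embeddings.
embeddings = c.encode(['a photo of a dog', 'ein Foto von einem Hund'])
print(embeddings.shape)
```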

numb3r3 reopened this Sep 27, 2022
@NOT-HAL9000
Author

@numb3r3 Thank you!
