
feat: add fp16 inference support (torch/onnx) #871

Merged
40 commits merged from add-fp16-inference-support into main on Dec 8, 2022

Conversation

OrangeSodahub
Contributor

@OrangeSodahub OrangeSodahub commented Dec 4, 2022

  • clip_torch
  • clip_onnx
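A minimal sketch of the fp16 inference pattern this kind of torch executor enables (hypothetical names, not the PR's actual code): convert model weights to half precision when a GPU is available, and cast inputs to match the model's dtype and device before the forward pass.

```python
import torch

# Illustrative stand-in for a CLIP-style encoder; the real executor
# loads a pretrained model instead of this toy linear layer.
def load_model(fp16: bool = False) -> torch.nn.Module:
    model = torch.nn.Linear(4, 2)
    model.eval()
    # fp16 weights only make sense on GPU; fall back to fp32 on CPU.
    if fp16 and torch.cuda.is_available():
        model = model.half().cuda()
    return model

def encode(model: torch.nn.Module, x: torch.Tensor) -> torch.Tensor:
    # Match the input's dtype/device to the model parameters,
    # so callers can pass plain fp32 CPU tensors either way.
    p = next(model.parameters())
    x = x.to(device=p.device, dtype=p.dtype)
    with torch.inference_mode():
        return model(x)

out = encode(load_model(fp16=True), torch.randn(1, 4))
```

On a CUDA machine `out` comes back as float16; without a GPU the sketch silently stays in float32, which mirrors the usual safety fallback for fp16 support.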

@github-actions github-actions bot added size/s and removed size/xs labels Dec 4, 2022
@codecov

codecov bot commented Dec 4, 2022

Codecov Report

Merging #871 (bd1fe7c) into main (fd16e5a) will decrease coverage by 9.69%.
The diff coverage is 93.47%.

@@            Coverage Diff             @@
##             main     #871      +/-   ##
==========================================
- Coverage   81.04%   71.35%   -9.69%     
==========================================
  Files          22       22              
  Lines        1498     1529      +31     
==========================================
- Hits         1214     1091     -123     
- Misses        284      438     +154     
Flag Coverage Δ
cas 71.35% <93.47%> (-9.69%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
server/clip_server/model/model.py 75.57% <81.81%> (+0.47%) ⬆️
server/clip_server/executors/clip_torch.py 87.77% <90.00%> (+0.87%) ⬆️
server/clip_server/executors/clip_onnx.py 87.91% <100.00%> (+2.68%) ⬆️
server/clip_server/executors/helper.py 97.18% <100.00%> (+0.12%) ⬆️
server/clip_server/helper.py 46.66% <100.00%> (+3.80%) ⬆️
server/clip_server/model/clip_onnx.py 76.19% <100.00%> (+3.46%) ⬆️
server/clip_server/executors/clip_tensorrt.py 0.00% <0.00%> (-94.60%) ⬇️
server/clip_server/model/clip_trt.py 0.00% <0.00%> (-69.39%) ⬇️
server/clip_server/model/trt_utils.py 0.00% <0.00%> (-56.05%) ⬇️
... and 1 more


@OrangeSodahub OrangeSodahub force-pushed the add-fp16-inference-support branch from d542df4 to 43bf259 Compare December 6, 2022 05:01
@jemmyshin jemmyshin assigned jemmyshin and unassigned jemmyshin Dec 6, 2022
@OrangeSodahub OrangeSodahub requested a review from ZiniuYu December 7, 2022 06:01
@OrangeSodahub OrangeSodahub changed the title feat: add fp16 inference support feat: add fp16 inference support (torch) Dec 8, 2022
@OrangeSodahub OrangeSodahub changed the title feat: add fp16 inference support (torch) feat: add fp16 inference support (torch/onnx) Dec 8, 2022
tests/test_fp16.py: review thread marked outdated and resolved
@OrangeSodahub OrangeSodahub marked this pull request as ready for review December 8, 2022 08:12
Contributor

@jemmyshin jemmyshin left a comment

lgtm

@jemmyshin jemmyshin merged commit 1fe3a5a into main Dec 8, 2022
@jemmyshin jemmyshin deleted the add-fp16-inference-support branch December 8, 2022 08:27

3 participants