fix: add models and md5 #783
Conversation
Codecov Report
@@ Coverage Diff @@
##            main     #783      +/-   ##
==========================================
- Coverage   86.64%   83.30%   -3.34%
==========================================
  Files          21       21
  Lines        1108     1108
==========================================
- Hits          960      923      -37
- Misses        148      185      +37
Flags with carried forward coverage won't be shown.
@@ -24,7 +24,7 @@ def __init__(self, name: str, device: str = 'cpu', jit: bool = False, **kwargs):
         if model_url:
             model_path = download_model(model_url, md5sum=md5sum)
             self._model = load_openai_model(model_path, device=device, jit=jit)
-        self._model_name = name
+        self._model_name = name.split('::')[0]
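For context, the change assumes model names can carry a tag after a `::` separator (the example names below are hypothetical, chosen only to illustrate the convention); taking element `[0]` of the split recovers the bare model name, while a plain untagged name passes through unchanged. A minimal sketch:

```python
# Hypothetical sketch of the '::'-tagged naming convention;
# the example names are illustrative, not from the project.
def bare_model_name(name: str) -> str:
    # 'ViT-B-32::openai' -> 'ViT-B-32'
    # 'RN50' (no tag)    -> 'RN50'
    return name.split('::')[0]

print(bare_model_name('ViT-B-32::openai'))  # -> ViT-B-32
print(bare_model_name('RN50'))              # -> RN50
```

Note that `str.split` on a separator that is absent returns a one-element list, so indexing `[0]` is safe for untagged names as well.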
@numb3r3 Check this
It should be split[0]; I had already fixed it in my PR.
I think the problem here can be addressed in a separate PR, so let's revert this change in this one.
We need this change to pass CI.
LGTM
This PR adds the open-clip model filenames and their corresponding MD5 checksums.
We now host a copy of the open-clip models and convert them to ONNX Runtime format.
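The `download_model(model_url, md5sum=md5sum)` call in the diff suggests a checksum-verified download. A minimal sketch of that pattern, assuming the project's actual helper works similarly (the function bodies here are illustrative, not the project's implementation):

```python
# Sketch of an MD5-verified model download; the helpers below are
# hypothetical stand-ins for the project's download_model utility.
import hashlib
import urllib.request


def md5_of(path: str, chunk: int = 1 << 20) -> str:
    # Stream the file in chunks so large model weights are never
    # loaded into memory all at once.
    h = hashlib.md5()
    with open(path, 'rb') as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()


def download_model(url: str, md5sum: str, dest: str) -> str:
    # Fetch the file, then fail loudly on a checksum mismatch so a
    # corrupted or tampered download is never silently loaded.
    urllib.request.urlretrieve(url, dest)
    digest = md5_of(dest)
    if digest != md5sum:
        raise ValueError(f'checksum mismatch for {dest}: '
                         f'expected {md5sum}, got {digest}')
    return dest
```

Verifying against a published MD5 catches truncated or corrupted downloads before the ONNX file is handed to the runtime.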