
feat(transformers): pretrained protocol support #3684

Merged
aarnphm merged 19 commits into bentoml:main from feat/support-models-api on Apr 4, 2023

Conversation

aarnphm
Contributor

@aarnphm commented Mar 19, 2023

This PR handles a few things:

  • Add support for saving any object that follows the pretrained protocol with save_model (see the sketch after this description)
  • Fix a known bug where the pipeline is saved twice (this happens with custom pipelines)

I have added a backward-compatibility layer for the v1 save format, but it still needs more testing to make sure it won't break compatibility.
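
For context, a minimal sketch of what saving pretrained (non-pipeline) objects could look like after this change; the model name and the exact set of accepted types here are assumptions for illustration, not taken from this PR's diff:

```python
import bentoml
import transformers

# Load a pretrained model and tokenizer directly, without wrapping
# them in a transformers.Pipeline.
tokenizer = transformers.AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = transformers.AutoModel.from_pretrained("distilbert-base-uncased")

# With this change, save_model should accept any object following the
# pretrained protocol, not only Pipeline instances.
bentoml.transformers.save_model("distilbert-tokenizer", tokenizer)
bentoml.transformers.save_model("distilbert-model", model)
```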

@aarnphm requested a review from a team as a code owner March 19, 2023 15:59
@aarnphm requested review from jjmachan, ssheng and parano and removed request for a team March 19, 2023 15:59
@aarnphm changed the title from "feat: PreTrained supports" to "feat(transformers): PreTrained supports" Mar 19, 2023

codecov bot commented Mar 19, 2023

Codecov Report

Merging #3684 (3b1f25f) into main (8211e5f) will not change coverage.
The diff coverage is 0.00%.

Impacted file tree graph

@@          Coverage Diff           @@
##            main   #3684    +/-   ##
======================================
  Coverage   0.00%   0.00%            
======================================
  Files        148     148            
  Lines      12137   12302   +165     
======================================
- Misses     12137   12302   +165     
Impacted Files                                      Coverage Δ
src/bentoml/_internal/frameworks/transformers.py    0.00% <0.00%> (ø)
src/bentoml/transformers.py                          0.00% <0.00%> (ø)

@aarnphm force-pushed the feat/support-models-api branch 2 times, most recently from e547443 to f05edb6, on March 19, 2023 16:30
@aarnphm changed the title from "feat(transformers): PreTrained supports" to "feat(transformers): pretrained protocol support" Mar 21, 2023
@parano requested review from bojiang and removed request for jjmachan March 22, 2023 20:46
@bojiang
Member

bojiang commented Mar 23, 2023

@aarnphm Hey Aaron, are the changes mentioned in the design docs ready for review, or still WIP?

@aarnphm
Contributor Author

aarnphm commented Mar 23, 2023

@bojiang This is ready for review. It addresses everything in the design doc. cc @parano @ssheng

@aarnphm
Contributor Author

aarnphm commented Mar 23, 2023

Though I don't know why coverage is failing.

@bojiang
Member

bojiang commented Mar 23, 2023

Couldn't parse '/Users/runner/work/BentoML/BentoML/src/bentoml/_internal/frameworks/transformers.py' as Python source: 'invalid syntax' at line 1

@aarnphm I think the unit test failure does indicate something is wrong under Python 3.7.
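
(For context: a parse failure like the one above under Python 3.7 usually means the file uses 3.8+ syntax somewhere. A minimal illustration of such constructs, not taken from this PR's diff:)

```python
# Illustration only: Python 3.8+ constructs that a 3.7 parser rejects with
# "invalid syntax", which also prevents coverage tooling from parsing the file.

# Assignment expression ("walrus" operator), added in Python 3.8:
if (retries := 3) > 0:
    print(f"retrying up to {retries} times")

# Positional-only parameters, added in Python 3.8:
def save_model(name, /, *, metadata=None):
    ...
```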

@aarnphm merged commit 4ea49df into bentoml:main Apr 4, 2023
@aarnphm deleted the feat/support-models-api branch April 4, 2023 22:40