
Remove BertConfig inheritance from RobertaConfig #20124

Merged: 2 commits into huggingface:main from roberta-config-update on Nov 9, 2022

Conversation

@Saad135 (Contributor) commented Nov 8, 2022

What does this PR do?

Removes BertConfig dependencies from RobertaConfig

Related to #19303

@sgugger, can I please get some feedback on this? Thanks 😄
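
For context, below is a minimal sketch of the pattern this PR applies: instead of inheriting its attributes from BertConfig, the configuration subclasses PretrainedConfig directly and declares its attributes itself. The class name RobertaConfigSketch and the abridged attribute list are illustrative assumptions, not the actual transformers source; only the defaults shown (vocab_size=50265 and the special-token ids) match RoBERTa's documented values.

    # Minimal sketch, assuming the usual transformers config pattern;
    # class name and abridged attribute list are illustrative, not the real file.
    from transformers import PretrainedConfig

    # Before: class RobertaConfig(BertConfig): ...  (attributes inherited)
    # After: a standalone config that owns its attributes explicitly.
    class RobertaConfigSketch(PretrainedConfig):
        model_type = "roberta"

        def __init__(
            self,
            vocab_size=50265,        # RoBERTa's default; BERT's is 30522
            hidden_size=768,
            num_hidden_layers=12,
            num_attention_heads=12,
            pad_token_id=1,          # RoBERTa's special-token ids differ from BERT's
            bos_token_id=0,
            eos_token_id=2,
            **kwargs,
        ):
            super().__init__(
                pad_token_id=pad_token_id,
                bos_token_id=bos_token_id,
                eos_token_id=eos_token_id,
                **kwargs,
            )
            self.vocab_size = vocab_size
            self.hidden_size = hidden_size
            self.num_hidden_layers = num_hidden_layers
            self.num_attention_heads = num_attention_heads

Making the config standalone fits the repository's single-model-file philosophy: each model's code should be readable on its own rather than depending on another model's classes.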

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

@HuggingFaceDocBuilderDev commented Nov 8, 2022

The documentation is not available anymore as the PR was closed or merged.

@sgugger (Collaborator) left a comment


Very nice, thanks a lot!


Args:
    vocab_size (`int`, *optional*, defaults to 30522):
        Vocabulary size of the BERT model. Defines the number of different tokens that can be represented by the

Suggested change:
- Vocabulary size of the BERT model. Defines the number of different tokens that can be represented by the
+ Vocabulary size of the RoBERTa model. Defines the number of different tokens that can be represented by the

@sgugger (Collaborator) commented Nov 9, 2022

Thanks again for your contribution!

@sgugger sgugger merged commit 0946ed9 into huggingface:main Nov 9, 2022
@Saad135 Saad135 deleted the roberta-config-update branch November 16, 2022 13:04
mpierrau pushed a commit to mpierrau/transformers that referenced this pull request Dec 15, 2022
* Remove BertConfig inheritance from RobertaConfig

* Fix Typo: BERT to RoBERTa