🔨 Pass `pre_trained` from config to `ModelLightning` #529
Conversation
I'm having some doubts about adding this parameter. For feature-extraction-based models such as padim and dfm, what would be the use case of a non-pre-trained backbone? Since training does not involve fine-tuning the backbone weights, the normality model would be built on random layer activations.
I don't know if there is a specific use case for this. The config.yaml file contains a pre_trained flag; if we don't want this, maybe we could remove it from the config file itself. Perhaps you could ask this question in PR #514 to confirm whether it actually has a use case?
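For reference, a minimal sketch of the kind of config.yaml entry under discussion; only the `pre_trained` key is from this PR, while the surrounding keys and values are illustrative assumptions:

```yaml
model:
  name: padim          # illustrative model choice
  backbone: resnet18   # illustrative backbone name
  pre_trained: true    # the flag this PR forwards to the Lightning model
```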
Based on the discussion in the other PR (#514) I think this is a viable use case, so I'm happy to merge this one.
Thanks!
Description

Pass the `pre_trained` parameter from the config to the `ModelLightning` implementation.
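As a rough illustration of the change (not anomalib's actual implementation; the class layout and every name other than `pre_trained` are assumptions), the flag can be forwarded from the parsed config to the backbone constructor like this:

```python
import pytorch_lightning as pl
import timm


class ModelLightning(pl.LightningModule):
    """Hypothetical Lightning wrapper around a timm feature extractor."""

    def __init__(self, backbone: str = "resnet18", pre_trained: bool = True):
        super().__init__()
        # Forward the flag so the backbone starts from ImageNet weights
        # (pre_trained=True) or from random initialization (pre_trained=False).
        self.feature_extractor = timm.create_model(
            backbone,
            pretrained=pre_trained,
            features_only=True,
        )


# Hypothetical wiring from a parsed config object to the model:
# model = ModelLightning(
#     backbone=config.model.backbone,
#     pre_trained=config.model.pre_trained,
# )
```

With a frozen feature extractor, `pre_trained=False` would leave the normality model fitted on random activations, which is the concern raised in the conversation above.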
Changes
Checklist