
How to incorporate loss function into my model? #45

Open
202041600047 opened this issue Mar 17, 2022 · 3 comments

Comments

@202041600047

Thank you very much for your work. I have a question: my model is BiSeNetV2, and I would like to replace my loss function with your pixel contrast loss. However, my model produces only a single output, while the `preds` argument of the pixel contrast loss is a dictionary with the keys 'seg' and 'embed'. I don't know what 'seg' and 'embed' mean, or how to obtain these two values from BiSeNetV2.

class ContrastCELoss(nn.Module, ABC):
    def forward(self, preds, target, with_embed=False):
        h, w = target.size(1), target.size(2)
        assert "seg" in preds
        assert "embed" in preds
        seg = preds['seg']
        embedding = preds['embed']
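(For context, the loss above expects `preds` to be a dictionary holding two tensors. A minimal sketch of the expected shapes — the batch size, class count, and embedding dimension below are illustrative assumptions, not values from the repository:)

```python
import torch

batch, n_classes, embed_dim, h, w = 2, 19, 256, 64, 64

# 'seg'  : raw segmentation logits, shape (B, C, H, W)
# 'embed': pixel-wise embeddings from a projection head, shape (B, D, H, W)
preds = {
    'seg': torch.randn(batch, n_classes, h, w),
    'embed': torch.randn(batch, embed_dim, h, w),
}

# This is the structure ContrastCELoss.forward asserts on before
# unpacking preds['seg'] and preds['embed'].
assert 'seg' in preds and 'embed' in preds
```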

@tfzhou
Owner

tfzhou commented Mar 18, 2022

@202041600047 Thanks for your interest. seg is the segmentation output of the model, while embed can be obtained through a convolutional projection head attached to your backbone. Sorry, I am not very familiar with the structure of BiSeNetV2, so I cannot tell you the specific layer to which you should add the projection head.
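(A convolutional projection head of the kind described here is commonly a couple of 1x1 convolutions followed by L2 normalization. The sketch below is one common design, not necessarily the exact layout used in this repository; the channel sizes are assumptions:)

```python
import torch
import torch.nn as nn


class ProjectionHead(nn.Module):
    """Maps backbone features to a normalized embedding space for
    contrastive learning. A common design; details may differ from the repo."""

    def __init__(self, in_channels, proj_dim=256):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Conv2d(in_channels, in_channels, kernel_size=1),
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, proj_dim, kernel_size=1),
        )

    def forward(self, x):
        # L2-normalize along the channel dimension so each pixel's
        # embedding lies on the unit hypersphere.
        return nn.functional.normalize(self.proj(x), p=2, dim=1)


feats = torch.randn(2, 128, 32, 32)  # e.g. last-stage backbone features
emb = ProjectionHead(128, proj_dim=256)(feats)
```

The head preserves spatial resolution (1x1 convolutions), so `emb` has shape `(2, 256, 32, 32)` and can be compared pixel-wise against the label map.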

@202041600047
Author


Thank you very much for taking time out of your busy schedule to answer my question. I have another question:
How do I get the embedding? The output of my model is a single tensor. Can I pass that output directly into the self.proj_head() method?
Replace
    def forward(self, x_, with_embed=False, is_eval=False):
        x = self.backbone(x_)
        embedding = self.proj_head(x[-1])
with
    def forward(self, x_, with_embed=False, is_eval=False):
        x = self.backbone(x_)  # x is a single tensor; self.backbone is my own network, BiSeNetV2
        embedding = self.proj_head(x)
Can I do this? In other words, I don't know what the input to the projection head should be.
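(One way to wire this up when the backbone returns a single feature tensor rather than a list: attach both a segmentation head and a projection head to that tensor and return the dictionary the loss expects. Everything below is an illustrative sketch — `DummyBiSeNet`, the channel counts, and the head layouts are assumptions standing in for the real BiSeNetV2; if your backbone returns final logits rather than features, the projection head should instead be attached to an intermediate feature map:)

```python
import torch
import torch.nn as nn


class DummyBiSeNet(nn.Module):
    """Stand-in for a backbone that returns one feature map (assumption)."""

    def __init__(self, out_channels=128):
        super().__init__()
        self.conv = nn.Conv2d(3, out_channels, kernel_size=3, padding=1)

    def forward(self, x):
        return self.conv(x)  # single tensor, not a list of stages


class SegModelWithEmbedding(nn.Module):
    """Wraps a single-output backbone and produces the {'seg', 'embed'} dict."""

    def __init__(self, backbone, feat_channels=128, n_classes=19, proj_dim=256):
        super().__init__()
        self.backbone = backbone
        self.seg_head = nn.Conv2d(feat_channels, n_classes, kernel_size=1)
        self.proj_head = nn.Conv2d(feat_channels, proj_dim, kernel_size=1)

    def forward(self, x_, with_embed=False, is_eval=False):
        feats = self.backbone(x_)          # one tensor, so no x[-1] indexing
        seg = self.seg_head(feats)         # segmentation logits -> 'seg'
        embedding = self.proj_head(feats)  # projected features  -> 'embed'
        return {'seg': seg, 'embed': embedding}


model = SegModelWithEmbedding(DummyBiSeNet())
out = model(torch.randn(1, 3, 64, 64))
```

The key point is that both heads consume the same feature tensor: passing the backbone output directly into `self.proj_head(x)` is fine when `x` is features, but not when it is already class logits.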

@heifanfanfan

Have you solved the problem of loading the new model? I have the same problem
