I found LOSS_CONSISTENCY in your code, but it does not appear in your paper. I ran the code on my own dataset, and LOSS_CONSISTENCY seems unstable. Is LOSS_CONSISTENCY meant to minimize the difference between the joint distribution and the marginal distributions of the multiple variational encoders?
Hello,
LOSS_CONSISTENCY was originally intended to improve training stability by encouraging the network to share information across the different views (this is achieved by constraining the joint and the marginal distributions to be similar). However, it was not helpful in its current form, so it was not used for the experiments in the paper.
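For reference, here is a minimal sketch of what such a consistency term could look like in PyTorch. It assumes Gaussian per-view posteriors parameterized by `(mu, logvar)` and a product-of-experts joint posterior; the function name `consistency_loss` and the `view_params` argument are illustrative only, not the actual implementation in this repository.

```python
import torch
from torch.distributions import Normal, kl_divergence


def consistency_loss(view_params, eps=1e-8):
    """Symmetric KL between a product-of-experts joint posterior and each
    per-view marginal posterior, encouraging the view encoders to agree.

    view_params: list of (mu, logvar) pairs, one per view,
                 each tensor of shape (batch, latent_dim).
    """
    # Product-of-experts joint: precision-weighted combination of the marginals.
    precisions = [torch.exp(-logvar) for _, logvar in view_params]
    joint_precision = torch.stack(precisions, dim=0).sum(dim=0) + eps
    joint_var = 1.0 / joint_precision
    joint_mu = joint_var * torch.stack(
        [mu * p for (mu, _), p in zip(view_params, precisions)], dim=0
    ).sum(dim=0)
    joint = Normal(joint_mu, joint_var.sqrt())

    loss = 0.0
    for mu, logvar in view_params:
        marginal = Normal(mu, torch.exp(0.5 * logvar))
        # Symmetric KL keeps the penalty balanced between both directions.
        kl = kl_divergence(joint, marginal) + kl_divergence(marginal, joint)
        loss = loss + 0.5 * kl.sum(dim=-1).mean()
    return loss / len(view_params)
```

One plausible reason such a term is unstable in practice is that the KL between the joint and a poorly trained marginal can blow up early in training, so it may need a small weight or a warm-up schedule if you want to experiment with it.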