questions about the weights of different losses #12

Open
icoz69 opened this issue Sep 21, 2017 · 2 comments
icoz69 commented Sep 21, 2017

In the demo_256 code, the weights of the different losses are 1, 1/1.6, 1/2.3, 1/2.8, and 10/0.5. Where do these hyperparameters come from?
In the paper it says they are the "inverse of the number of elements in each layer". What do you mean by "number of elements", and how are the weights above calculated?
Looking forward to your reply, thank you.

CQFIO (Owner) commented Sep 23, 2017

If you just use the weights listed there, it should be fine.

I use tf.reduce_mean to compute the average. The weights 1, 1/1.6, 1/2.3, 1/2.8, and 10/0.5 are computed from the average values of l0, ..., l5 at the 100th epoch.
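For readers of the thread, here is a minimal sketch (not the repository's exact code) of how such per-layer weights might be applied to a VGG-based perceptual loss with tf.reduce_mean. The names `vgg_real` and `vgg_fake` are hypothetical lists of matching VGG feature maps for the real and generated images; the weights are the constants discussed above.

```python
import tensorflow as tf

def perceptual_loss(vgg_real, vgg_fake,
                    weights=(1.0, 1.0 / 1.6, 1.0 / 2.3, 1.0 / 2.8, 10.0 / 0.5)):
    """Weighted sum of mean absolute differences between VGG feature maps.

    Sketch only: in the procedure described above, one could record the average
    per-layer losses l0, ..., l5 around epoch 100 and set each weight roughly to
    the inverse of that layer's average loss.
    """
    total = tf.constant(0.0)
    for w, real, fake in zip(weights, vgg_real, vgg_fake):
        # tf.reduce_mean averages |real - fake| over every element of the layer,
        # giving one scalar per layer; the weight w rescales that scalar.
        total += w * tf.reduce_mean(tf.abs(real - fake))
    return total
```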

icoz69 (Author) commented Oct 6, 2017

@CQFIO
Thanks for your reply.
What do l0, ..., l5 refer to?
And how could the weights be learnable? The losses are always positive, so the weights would keep decreasing during training.
