Dropout uses the *memory address* of seeds instead of reading seeds from memory #173
Comments
oh just seeing that, sorry for the delay!
should be fixed, and sanity-checking the speed, it's not really changed
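For reference, here is a minimal sketch of what the fix plausibly looks like on the kernel side (illustrative only, not the actual patch): the seed value is loaded from memory with `tl.load` before it is used to seed the RNG, instead of passing the pointer itself. The kernel name and shape are hypothetical.

```python
import triton
import triton.language as tl

@triton.jit
def k_seed_demo(OUT, SEEDS, BLOCK: tl.constexpr):
    # Hypothetical kernel: one program per row, one seed value per row.
    row = tl.program_id(axis=0)
    offsets = row * BLOCK + tl.arange(0, BLOCK)

    # Buggy pattern reported in this issue: `SEEDS + row` is a pointer,
    # so the RNG would be seeded with a memory address.
    #   seed = SEEDS + row
    # Fixed pattern: dereference the pointer to get the stored value.
    seed = tl.load(SEEDS + row)

    random = tl.rand(seed, offsets)  # deterministic for a given seed value
    tl.store(OUT + offsets, random)
```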
🐛 Bug
From reading the code for `k_dropout_fw` and `k_dropout_bw`, it seems to me that the seeds are never read from memory and the code simply uses the memory address of the seed. For example:
Here `seed` is still a memory address and not an integer.
As a result, when `k_dropout_fw` is passed two identical seed tensors with different memory addresses, it produces different results.

To Reproduce
Setting the PyTorch seed should produce the same `seed` used in dropout, and should therefore produce the same dropout mask. However, that's not the case.
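The original reproduction script was also lost in extraction; below is a minimal sketch of the idea, assuming the Triton dropout is exposed as `xformers.triton.dropout` (the import path and signature are assumptions and may differ across versions):

```python
import torch
from xformers.triton import dropout  # assumed import path

torch.manual_seed(0)
x = torch.ones(16, 512, device="cuda", dtype=torch.float16)
y1 = dropout(x, p=0.5)

torch.manual_seed(0)  # same PyTorch seed, so the seed tensor holds the same values
y2 = dropout(x, p=0.5)

# The dropout masks should therefore be identical. With the pointer-as-seed
# bug they generally are not, because the two seed tensors are allocated at
# different memory addresses.
print(torch.equal(y1 == 0, y2 == 0))  # expected: True; observed: False
```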
Environment

Install method (`conda`, `pip`, source): `conda`