When MetaBatchNormLayer's forward is called with backup_running_statistics=True, the running statistics are meant to be copied into the backup variables by this line:

HowToTrainYourMAMLPytorch/meta_neural_network_architectures.py, line 241 in 8696470

However, this is not what happens: the backup ends up sharing the underlying data, so when the running statistics are updated, the backup changes with them. Here is a short code snippet that minimally reproduces the behavior:
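A minimal sketch of the described aliasing (the names are hypothetical stand-ins rather than the reporter's original snippet; it assumes nothing beyond plain PyTorch tensors):

```python
import torch

# Hypothetical stand-ins for the layer's running-statistics buffers.
running_mean = torch.zeros(5)
backup_running_mean = torch.zeros(5)

# Assigning .data only rebinds the storage reference; no values are copied.
backup_running_mean.data = running_mean.data

# Simulate the running statistics being updated during a later forward pass.
running_mean += 1.0

# The "backup" has changed too, because both tensors now share storage.
print(backup_running_mean)                                         # tensor([1., 1., 1., 1., 1.])
print(backup_running_mean.data_ptr() == running_mean.data_ptr())   # True
```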
Is my understanding of backing up the running statistics wrong, or is this a bug that needs fixing?

@denizetkar is right: that's not your intended behaviour, and the backup is overwritten by the validation pass. You simply need to use deepcopy instead.
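For illustration, a sketch of the suggested fix under the same assumptions as above: taking a real copy (copy.deepcopy, or equivalently torch.Tensor.clone) gives the backup its own storage, so later updates to the running statistics no longer leak into it.

```python
import copy

import torch

running_mean = torch.zeros(5)

# deepcopy allocates fresh storage for the backup instead of aliasing it.
backup_running_mean = copy.deepcopy(running_mean)  # running_mean.clone() would also work

# Updating the running statistics no longer touches the backup.
running_mean += 1.0

print(backup_running_mean)                                          # tensor([0., 0., 0., 0., 0.])
print(backup_running_mean.data_ptr() == running_mean.data_ptr())    # False
```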