
ShakeDrop regularization

PyTorch implementation of ShakeDrop regularization. The author's Torch implementation is here.

Dependencies

  • Python 3.5+
  • PyTorch 1.0.0

Accuracy

CIFAR-100

| Model      | Method    | Level | Alpha   | Beta   | This implementation (%) | Paper (%) |
| ---------- | --------- | ----- | ------- | ------ | ----------------------- | --------- |
| PyramidNet | ShakeDrop | Batch | [-1, 1] | [0, 1] | 83.90                   | 83.78     |
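
The "Level", "Alpha" and "Beta" columns map directly onto the ShakeDrop rule: during training the residual branch output is scaled by (b + α − bα) in the forward pass, with a batch-level Bernoulli gate b and α drawn from [-1, 1], and the gradient is scaled by (b + β − bβ) with β drawn from [0, 1]; at test time the expected coefficient is used instead. The snippet below is a minimal sketch of that rule as a custom autograd function. The class name `ShakeDropFunction` and the `p_drop` argument are illustrative only and do not necessarily match the names used in this repository's code.

```python
import torch
from torch.autograd import Function

class ShakeDropFunction(Function):
    """Batch-level ShakeDrop sketch: one random gate/alpha/beta shared by the whole mini-batch."""

    @staticmethod
    def forward(ctx, x, training=True, p_drop=0.5):
        ctx.training = training
        ctx.p_drop = p_drop
        if training:
            # gate = 1 leaves the residual branch untouched, gate = 0 perturbs it
            gate = torch.bernoulli(torch.tensor(1.0 - p_drop, device=x.device))
            alpha = torch.empty(1, device=x.device).uniform_(-1.0, 1.0)  # forward noise in [-1, 1]
            ctx.save_for_backward(gate)
            return (gate + alpha - gate * alpha) * x
        # inference: scale by the expected coefficient; with E[alpha] = 0 this is (1 - p_drop)
        return (1.0 - p_drop) * x

    @staticmethod
    def backward(ctx, grad_output):
        if not ctx.training:
            return (1.0 - ctx.p_drop) * grad_output, None, None
        (gate,) = ctx.saved_tensors
        beta = torch.empty(1, device=grad_output.device).uniform_(0.0, 1.0)  # backward noise in [0, 1]
        return (gate + beta - gate * beta) * grad_output, None, None

# Typical use inside a residual block's forward pass:
#   out = ShakeDropFunction.apply(branch_out, self.training, p_drop) + shortcut
```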

Training on CIFAR-100

Train PyramidNet (depth=110, alpha=270) with ShakeDrop on CIFAR-100:

python train.py --epochs 300 --batch_size 128 --label 100 --lr 0.5 --depth 110 --alpha 270 --snapshot_interval 10
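
One detail worth keeping in mind when reproducing the paper's setting: ShakeDrop schedules the per-block gate probability with a linear decay over depth (as in stochastic depth), ending at 0.5 for the deepest block. The helper below is a hypothetical illustration of that schedule, not a `train.py` option; the function name is made up.

```python
def gate_probability(block_index, num_blocks, p_last=0.5):
    """P(gate = 1) for block `block_index` (1-based, out of `num_blocks`)
    under the linear-decay schedule from the ShakeDrop paper."""
    return 1.0 - (block_index / num_blocks) * (1.0 - p_last)

# A depth-110 PyramidNet built from basic blocks has (110 - 2) // 2 = 54 residual blocks,
# so the schedule runs from about 0.99 for the first block down to 0.5 for the last one.
probs = [gate_probability(l, 54) for l in range(1, 55)]
```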

References

Yoshihiro Yamada, Masakazu Iwamura, Koichi Kise. "ShakeDrop Regularization." ICLR 2018 (under review).

Yoshihiro Yamada, Masakazu Iwamura, Takuya Akiba, Koichi Kise. "ShakeDrop Regularization for Deep Residual Learning." arXiv:1802.02375v2.
