- Update (2019.10): Training now starts with the symmetric cross entropy (SCE) loss in place of standard cross entropy. SCE was shown to improve several existing methods, including D2L, in the ICCV 2019 paper "Symmetric Cross Entropy for Robust Learning with Noisy Labels": https://arxiv.org/abs/1908.06112 (code: https://github.com/YisenWang/symmetric_cross_entropy_for_noisy_labels).
- Update (2020.03): Fixed a convergence issue on CIFAR-100 when using the SCE loss, by adjusting the learning rate, data augmentation, and SCE parameters.
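For reference, the SCE loss combines standard cross entropy with a reverse cross entropy term in which zero label entries are clipped to a constant. Below is a minimal NumPy sketch of that formula; the function name and the hyperparameter values (`alpha`, `beta`, `A`) are illustrative defaults, not the per-dataset settings used in this repo:

```python
import numpy as np

def sce_loss(y_true, y_pred, alpha=0.1, beta=1.0, A=-4.0):
    """Symmetric cross entropy: alpha * CE + beta * RCE.

    y_true: one-hot labels, shape (batch, classes).
    y_pred: predicted class probabilities, same shape.
    A: value substituted for log(0) in the reverse term.
    Hyperparameter defaults here are illustrative only.
    """
    eps = 1e-7
    pred = np.clip(y_pred, eps, 1.0)
    # Standard cross entropy: -sum q(k) * log p(k)
    ce = -np.sum(y_true * np.log(pred), axis=-1)
    # Reverse cross entropy swaps the roles of labels and predictions;
    # clipping the labels at exp(A) makes log(0) become the constant A.
    label = np.clip(y_true, np.exp(A), 1.0)
    rce = -np.sum(pred * np.log(label), axis=-1)
    return alpha * ce + beta * rce
```

With a uniform prediction over two classes, the reverse term dominates (it equals `-0.5 * A` here), which is the mechanism SCE uses to penalize fitting noisy labels.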
An example:

```
python train_model.py -d mnist -m d2l -e 50 -b 128 -r 40
```

- `-d`: dataset, one of ['mnist', 'svhn', 'cifar-10', 'cifar-100']
- `-m`: model, one of ['ce', 'forward', 'backward', 'boot_hard', 'boot_soft', 'd2l']
- `-e`: number of epochs
- `-b`: batch size
- `-r`: noise rate, in [0, 100] (percent)
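The parser itself is defined in the repo's training script and is not shown here; a minimal `argparse` setup consistent with the flags above might look like the following (the long option names and defaults are assumptions, not the repo's actual definitions):

```python
import argparse

def build_parser():
    # Flag set mirrors the options documented above; long names
    # and defaults are illustrative guesses, not the repo's code.
    parser = argparse.ArgumentParser(
        description='Train a model under label noise')
    parser.add_argument('-d', '--dataset', default='mnist',
                        choices=['mnist', 'svhn', 'cifar-10', 'cifar-100'])
    parser.add_argument('-m', '--model_name', default='d2l',
                        choices=['ce', 'forward', 'backward',
                                 'boot_hard', 'boot_soft', 'd2l'])
    parser.add_argument('-e', '--epochs', type=int, default=50)
    parser.add_argument('-b', '--batch_size', type=int, default=128)
    parser.add_argument('-r', '--noise_rate', type=int, default=40,
                        help='label noise rate in percent, 0-100')
    return parser
```

Parsing the example command's arguments with this sketch yields `epochs=50`, `batch_size=128`, and `noise_rate=40`.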
```python
# mnist example
args = parser.parse_args(['-d', 'mnist', '-m', 'd2l',
                          '-e', '50', '-b', '128',
                          '-r', '40'])
main(args)

# svhn example
args = parser.parse_args(['-d', 'svhn', '-m', 'd2l',
                          '-e', '50', '-b', '128',
                          '-r', '40'])
main(args)

# cifar-10 example
args = parser.parse_args(['-d', 'cifar-10', '-m', 'd2l',
                          '-e', '120', '-b', '128',
                          '-r', '40'])
main(args)

# cifar-100 example
args = parser.parse_args(['-d', 'cifar-100', '-m', 'd2l',
                          '-e', '200', '-b', '128',
                          '-r', '40'])
main(args)
```
Requirements: tensorflow, Keras, numpy, scipy, sklearn, matplotlib