
Several recent regularization methods:
- Shake-Shake
- Cutout
- mixup
- pairing samples
- ShakeDrop
- AutoAugment
Shake-Shake
code: https://github.com/xgastaldi/shake-shake
Gastaldi X. Shake-shake regularization[J]. arXiv preprint arXiv:1705.07485, 2017.
The idea is to replace, in a multi-branch network, the standard summation of parallel branches with a stochastic affine combination.
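A minimal forward-pass sketch of that stochastic affine combination, in plain Python on flat feature lists (the function name and list-based tensors are illustrative; the paper additionally replaces the backward pass coefficient with an independent random variable, which this forward-only sketch omits):

```python
import random

def shake_shake_forward(branch1_out, branch2_out, training=True, rng=random):
    """Combine two parallel residual-branch outputs.

    Training: alpha ~ U(0, 1) is resampled each pass (per image in the
    paper's best setting). Inference: the expected value alpha = 0.5,
    which reduces to the usual averaged summation.
    """
    alpha = rng.random() if training else 0.5
    return [alpha * a + (1.0 - alpha) * b
            for a, b in zip(branch1_out, branch2_out)]
```

At inference the combination is deterministic, e.g. `shake_shake_forward([1.0, 2.0], [3.0, 4.0], training=False)` gives `[2.0, 3.0]`.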
Cutout
code: https://github.com/uoguelph-mlrg/Cutout
DeVries T, Taylor G W. Improved regularization of convolutional neural networks with cutout[J]. arXiv preprint arXiv:1708.04552, 2017.
Randomly masks out square regions of the input image during training.
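A minimal sketch of the masking step on a 2-D image stored as a list of lists (function name and representation are assumptions; the reference implementation operates on tensors and normalizes first):

```python
import random

def cutout(image, mask_size, rng=random):
    """Zero out a random mask_size x mask_size square of a 2-D image.

    The mask centre is sampled uniformly over the image, so the square
    may be clipped at the border, as in the paper. Returns a copy.
    """
    h, w = len(image), len(image[0])
    cy, cx = rng.randrange(h), rng.randrange(w)
    half = mask_size // 2
    out = [row[:] for row in image]
    for y in range(max(0, cy - half), min(h, cy + half)):
        for x in range(max(0, cx - half), min(w, cx + half)):
            out[y][x] = 0.0
    return out
```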

mixup
Zhang H, Cisse M, Dauphin Y N, et al. mixup: Beyond empirical risk minimization[J]. arXiv preprint arXiv:1710.09412, 2017.
Trains on convex combinations of pairs of training examples and their labels.
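A minimal sketch of mixing one pair of examples, assuming flat feature vectors and one-hot labels as plain Python lists (names are illustrative). The mixing weight follows the paper's Beta(alpha, alpha) distribution:

```python
import random

def mixup(x1, y1, x2, y2, alpha=0.2, rng=random):
    """Return a convex combination of two examples and their labels.

    lam ~ Beta(alpha, alpha); small alpha pushes lam toward 0 or 1,
    so most virtual examples stay close to a real one.
    """
    lam = rng.betavariate(alpha, alpha)
    x = [lam * a + (1.0 - lam) * b for a, b in zip(x1, x2)]
    y = [lam * a + (1.0 - lam) * b for a, b in zip(y1, y2)]
    return x, y
```

The mixed label is still a valid distribution: its entries sum to 1 whenever the inputs are one-hot.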
pairing samples
Inoue H. Data augmentation by pairing samples for images classification[J]. arXiv preprint arXiv:1801.02929, 2018.
Averages each training image pixel-wise with another randomly chosen image, while keeping only the first image's label.
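A one-line sketch of the mixing step (function name and flat-list image representation are assumptions):

```python
def sample_pairing(image_a, image_b):
    """SamplePairing: average two images element-wise.

    The mixed image keeps only image_a's label; unlike mixup,
    the labels are not interpolated.
    """
    return [(a + b) / 2.0 for a, b in zip(image_a, image_b)]
```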

ShakeDrop
code: https://github.com/imenurok/ShakeDrop
Realizes a regularization similar to Shake-Shake on single-branch network architectures.
Yamada Y, Iwamura M, Kise K. ShakeDrop regularization[J]. arXiv preprint arXiv:1802.02375, 2018.
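A forward-pass sketch on flat feature lists (names are illustrative; the paper also perturbs the backward pass with a separate random scale, omitted here). ShakeDrop computes H(x) = x + (b + alpha - b*alpha) * F(x) with b ~ Bernoulli(p) and alpha ~ U(-1, 1); since E[alpha] = 0, the expected scale is p, which is used at test time:

```python
import random

def shakedrop_forward(x, residual, p=0.5, training=True, rng=random):
    """ShakeDrop forward pass for one residual block.

    b = 1 (prob p) leaves the residual untouched; b = 0 replaces its
    scale with alpha ~ U(-1, 1). Inference uses the expected scale p.
    """
    if training:
        b = 1.0 if rng.random() < p else 0.0
        alpha = rng.uniform(-1.0, 1.0)
        scale = b + alpha - b * alpha
    else:
        scale = p
    return [xi + scale * fi for xi, fi in zip(x, residual)]
```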



AutoAugment
code: https://github.com/DeepVoltaire/AutoAugment
Cubuk E D, Zoph B, Mane D, et al. AutoAugment: Learning Augmentation Policies from Data[J]. arXiv preprint arXiv:1805.09501, 2018.
Learns augmentation policies directly from data: a search (reinforcement learning in the paper) selects sub-policies, each a sequence of image operations with an application probability and a magnitude.
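A sketch of how a learned sub-policy is applied at training time (the search itself is out of scope here). The sub-policy format matches the paper's (operation, probability, magnitude) triples; the toy brightness operation and all names are assumptions for illustration:

```python
import random

def apply_subpolicy(image, subpolicy, rng=random):
    """Apply one AutoAugment-style sub-policy to an image.

    Each (operation, probability, magnitude) triple is applied in
    order, with its own application probability.
    """
    for op, prob, magnitude in subpolicy:
        if rng.random() < prob:
            image = op(image, magnitude)
    return image

def brightness(image, magnitude):
    """Toy operation on a flat list of pixels in [0, 1]."""
    return [min(1.0, p * magnitude) for p in image]

# hypothetical sub-policy: scale brightness by 1.2 with probability 0.8
example_subpolicy = [(brightness, 0.8, 1.2)]
```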









