
Drop-Activation: Implicit Parameter Reduction and Harmonious Regularization


This repository is the official implementation of the paper "Drop-Activation: Implicit Parameter Reduction and Harmonious Regularization" [paper].

By Senwei Liang, Yuehaw Khoo and Haizhao Yang.

Introduction

Drop-Activation is a regularization method that reduces the risk of overfitting. The key idea is to drop nonlinear activation functions by randomly setting them to identity functions during training. At test time, we use a deterministic network with a new activation function that encodes the average effect of randomly dropping activations.
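The training/testing rule above can be sketched as a small PyTorch module. This is a minimal sketch, not the repository's implementation; the module name and the keep-probability `p` (and its default value) are illustrative assumptions:

```python
import torch
import torch.nn as nn


class DropActivation(nn.Module):
    """Sketch of Drop-Activation (illustrative; p is a hypothetical keep-probability).

    Training: each unit keeps its ReLU with probability p and uses the
    identity with probability 1 - p.
    Testing: the deterministic average of the random activations,
    p * relu(x) + (1 - p) * x, which resembles a leaky ReLU.
    """

    def __init__(self, p: float = 0.95):
        super().__init__()
        self.p = p  # probability of keeping the nonlinearity

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Bernoulli mask: 1 -> apply ReLU, 0 -> pass through unchanged
            mask = torch.bernoulli(torch.full_like(x, self.p))
            return mask * torch.relu(x) + (1.0 - mask) * x
        # Test time: deterministic average over the random choices
        return self.p * torch.relu(x) + (1.0 - self.p) * x
```

In a network, this module would simply replace each `nn.ReLU()`; switching between the stochastic and deterministic behavior is handled by the standard `model.train()` / `model.eval()` flags.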


Requirements

Usage

  • Clone the Drop-Activation repository:

    git clone https://github.com/LeungSamWai/Drop-Activation.git

  • Train WideResNet28-10 with Drop-Activation on CIFAR-100:

    python cifar.py -a da_wrn --dataset cifar100 --depth 28 --widen-factor 10 --drop 0.3 --epochs 200 --schedule 60 120 160 --wd 5e-4 --gamma 0.2 --checkpoint checkpoints/cifar100/WRN-28-10-drop-DropActivation

Citing

  • If you find this work useful in your research, please cite:

    @article{Liang2018Drop,
      title={Drop-Activation: Implicit Parameter Reduction and Harmonious Regularization},
      author={Liang, Senwei and Khoo, Yuehaw and Yang, Haizhao},
      year={2018},
    }

Acknowledgements

We thank bearpaw for his well-organized PyTorch framework for image classification tasks.
