
Best CIFAR-10, CIFAR-100 results with wide-residual networks using PyTorch

PyTorch implementation of Sergey Zagoruyko's Wide Residual Networks.

For Torch implementations, see here.

Requirements

See the installation instructions for a step-by-step installation guide, and the server instructions for server setup.

pip install http://download.pytorch.org/whl/cu80/torch-0.1.12.post2-cp27-none-linux_x86_64.whl
pip install torchvision
git clone https://github.com/meliketoy/wide-resnet.pytorch
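
After installing, a quick sanity check can confirm that the GPU build of PyTorch is visible (this snippet is illustrative and not part of the repository):

```python
import torch

# Print the installed PyTorch version and whether a CUDA device is available.
print(torch.__version__)
print(torch.cuda.is_available())
```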

How to run

After you have cloned the repository, you can train on either the cifar10 or cifar100 dataset by running the script below.

python main.py --lr 0.1 --resume false --net_type [lenet/vggnet/resnet/wide-resnet] --depth 28 --widen_factor 10 --dropout_rate 0.3 --dataset [cifar10/cifar100]
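
For example, to train a wide-resnet 28x10 with dropout rate 0.3 on CIFAR-100 using the flags above:

python main.py --lr 0.1 --resume false --net_type wide-resnet --depth 28 --widen_factor 10 --dropout_rate 0.3 --dataset cifar100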

Implementation Details

| epoch     | learning rate | weight decay | optimizer | momentum | Nesterov |
|-----------|---------------|--------------|-----------|----------|----------|
| 0 ~ 60    | 0.1           | 0.0005       | Momentum  | 0.9      | true     |
| 61 ~ 120  | 0.02          | 0.0005       | Momentum  | 0.9      | true     |
| 121 ~ 160 | 0.004         | 0.0005       | Momentum  | 0.9      | true     |
| 161 ~ 200 | 0.0008        | 0.0005       | Momentum  | 0.9      | true     |
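
For reference, a minimal sketch of how this schedule maps onto the PyTorch optimizer API; it uses the current torch.optim.lr_scheduler interface and a placeholder model, not the repository's own training code:

```python
import torch.nn as nn
import torch.optim as optim
from torch.optim.lr_scheduler import MultiStepLR

# Placeholder model: a single linear layer stands in for one of the
# networks above (e.g. Wide-ResNet 28x10) so the snippet runs on its own.
model = nn.Linear(3 * 32 * 32, 10)

# Optimizer matching the table: SGD with momentum 0.9, Nesterov enabled,
# weight decay 0.0005, initial learning rate 0.1.
optimizer = optim.SGD(model.parameters(), lr=0.1, momentum=0.9,
                      weight_decay=5e-4, nesterov=True)

# The learning rate is divided by 5 at epochs 60, 120 and 160,
# giving 0.1 -> 0.02 -> 0.004 -> 0.0008 as listed in the table.
scheduler = MultiStepLR(optimizer, milestones=[60, 120, 160], gamma=0.2)

for epoch in range(200):
    # ... one epoch of training would go here ...
    scheduler.step()
```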

CIFAR-10 Results


Below are the test set accuracy results for training on the CIFAR-10 dataset.

Accuracy is the average of 5 runs

| network           | dropout | preprocess | GPU:0 | GPU:1 | time per epoch | accuracy (%) |
|-------------------|---------|------------|-------|-------|----------------|--------------|
| wide-resnet 28x10 | 0       | ZCA        | 5.90G | -     | 2 min 03 sec   | 95.83        |
| wide-resnet 28x10 | 0       | meanstd    | 5.90G | -     | 2 min 03 sec   | 96.21        |
| wide-resnet 28x10 | 0.3     | meanstd    | 5.90G | -     | 2 min 03 sec   | 96.27        |
| wide-resnet 28x20 | 0.3     | meanstd    | 8.13G | 6.93G | 4 min 10 sec   | 96.55        |
| wide-resnet 40x10 | 0.3     | meanstd    | 8.08G | -     | 3 min 13 sec   | 96.31        |
| wide-resnet 40x14 | 0.3     | meanstd    | 7.37G | 6.46G | 3 min 23 sec   | 96.34        |
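
The meanstd entries in the preprocess column above refer to per-channel mean/standard-deviation normalization. A minimal sketch with torchvision transforms, assuming commonly quoted CIFAR-10 statistics rather than the exact values used in this repository:

```python
import torchvision.transforms as transforms

# Commonly used CIFAR-10 per-channel statistics (assumed here, not taken
# from this repository's configuration).
cifar10_mean = (0.4914, 0.4822, 0.4465)
cifar10_std = (0.2470, 0.2435, 0.2616)

# Standard augmentation plus meanstd normalization for training images.
transform_train = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize(cifar10_mean, cifar10_std),
])
```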

CIFAR-100 Results


Below are the test set accuracy results for training on the CIFAR-100 dataset.

Accuracy is the average of 5 runs

| network           | dropout | preprocess | GPU:0 | GPU:1 | time per epoch | Top1 acc (%) | Top5 acc (%) |
|-------------------|---------|------------|-------|-------|----------------|--------------|--------------|
| wide-resnet 28x10 | 0       | ZCA        | 5.90G | -     | 2 min 03 sec   | 80.07        | 95.02        |
| wide-resnet 28x10 | 0       | meanstd    | 5.90G | -     | 2 min 03 sec   | 81.02        | 95.41        |
| wide-resnet 28x10 | 0.3     | meanstd    | 5.90G | -     | 2 min 03 sec   | 81.49        | 95.62        |
| wide-resnet 28x20 | 0.3     | meanstd    | 8.13G | 6.93G | 4 min 05 sec   | 82.45        | 96.11        |
| wide-resnet 40x10 | 0.3     | meanstd    | 8.93G | -     | 3 min 06 sec   | 81.42        | 95.63        |
| wide-resnet 40x14 | 0.3     | meanstd    | 7.39G | 6.46G | 3 min 23 sec   | 81.87        | 95.51        |
