@koshian2
Created October 2, 2018 06:09
Wide-ResNet k=7, N=4, kernel=7
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) (None, 32, 32, 3) 0
__________________________________________________________________________________________________
conv2d (Conv2D) (None, 32, 32, 112) 448 input_1[0][0]
__________________________________________________________________________________________________
conv2d_1 (Conv2D) (None, 32, 32, 112) 614768 conv2d[0][0]
__________________________________________________________________________________________________
batch_normalization (BatchNorma (None, 32, 32, 112) 448 conv2d_1[0][0]
__________________________________________________________________________________________________
activation (Activation) (None, 32, 32, 112) 0 batch_normalization[0][0]
__________________________________________________________________________________________________
conv2d_2 (Conv2D) (None, 32, 32, 112) 614768 activation[0][0]
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 32, 32, 112) 448 conv2d_2[0][0]
__________________________________________________________________________________________________
activation_1 (Activation) (None, 32, 32, 112) 0 batch_normalization_1[0][0]
__________________________________________________________________________________________________
add (Add) (None, 32, 32, 112) 0 conv2d[0][0]
activation_1[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D) (None, 32, 32, 112) 614768 add[0][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 32, 32, 112) 448 conv2d_3[0][0]
__________________________________________________________________________________________________
activation_2 (Activation) (None, 32, 32, 112) 0 batch_normalization_2[0][0]
__________________________________________________________________________________________________
conv2d_4 (Conv2D) (None, 32, 32, 112) 614768 activation_2[0][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 32, 32, 112) 448 conv2d_4[0][0]
__________________________________________________________________________________________________
activation_3 (Activation) (None, 32, 32, 112) 0 batch_normalization_3[0][0]
__________________________________________________________________________________________________
add_1 (Add) (None, 32, 32, 112) 0 add[0][0]
activation_3[0][0]
__________________________________________________________________________________________________
conv2d_5 (Conv2D) (None, 32, 32, 112) 614768 add_1[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 32, 32, 112) 448 conv2d_5[0][0]
__________________________________________________________________________________________________
activation_4 (Activation) (None, 32, 32, 112) 0 batch_normalization_4[0][0]
__________________________________________________________________________________________________
conv2d_6 (Conv2D) (None, 32, 32, 112) 614768 activation_4[0][0]
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 32, 32, 112) 448 conv2d_6[0][0]
__________________________________________________________________________________________________
activation_5 (Activation) (None, 32, 32, 112) 0 batch_normalization_5[0][0]
__________________________________________________________________________________________________
add_2 (Add) (None, 32, 32, 112) 0 add_1[0][0]
activation_5[0][0]
__________________________________________________________________________________________________
conv2d_7 (Conv2D) (None, 32, 32, 112) 614768 add_2[0][0]
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 32, 32, 112) 448 conv2d_7[0][0]
__________________________________________________________________________________________________
activation_6 (Activation) (None, 32, 32, 112) 0 batch_normalization_6[0][0]
__________________________________________________________________________________________________
conv2d_8 (Conv2D) (None, 32, 32, 112) 614768 activation_6[0][0]
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 32, 32, 112) 448 conv2d_8[0][0]
__________________________________________________________________________________________________
activation_7 (Activation) (None, 32, 32, 112) 0 batch_normalization_7[0][0]
__________________________________________________________________________________________________
add_3 (Add) (None, 32, 32, 112) 0 add_2[0][0]
activation_7[0][0]
__________________________________________________________________________________________________
average_pooling2d (AveragePooli (None, 16, 16, 112) 0 add_3[0][0]
__________________________________________________________________________________________________
conv2d_9 (Conv2D) (None, 16, 16, 224) 25312 average_pooling2d[0][0]
__________________________________________________________________________________________________
conv2d_10 (Conv2D) (None, 16, 16, 224) 2458848 conv2d_9[0][0]
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 16, 16, 224) 896 conv2d_10[0][0]
__________________________________________________________________________________________________
activation_8 (Activation) (None, 16, 16, 224) 0 batch_normalization_8[0][0]
__________________________________________________________________________________________________
conv2d_11 (Conv2D) (None, 16, 16, 224) 2458848 activation_8[0][0]
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 16, 16, 224) 896 conv2d_11[0][0]
__________________________________________________________________________________________________
activation_9 (Activation) (None, 16, 16, 224) 0 batch_normalization_9[0][0]
__________________________________________________________________________________________________
add_4 (Add) (None, 16, 16, 224) 0 conv2d_9[0][0]
activation_9[0][0]
__________________________________________________________________________________________________
conv2d_12 (Conv2D) (None, 16, 16, 224) 2458848 add_4[0][0]
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 16, 16, 224) 896 conv2d_12[0][0]
__________________________________________________________________________________________________
activation_10 (Activation) (None, 16, 16, 224) 0 batch_normalization_10[0][0]
__________________________________________________________________________________________________
conv2d_13 (Conv2D) (None, 16, 16, 224) 2458848 activation_10[0][0]
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 16, 16, 224) 896 conv2d_13[0][0]
__________________________________________________________________________________________________
activation_11 (Activation) (None, 16, 16, 224) 0 batch_normalization_11[0][0]
__________________________________________________________________________________________________
add_5 (Add) (None, 16, 16, 224) 0 add_4[0][0]
activation_11[0][0]
__________________________________________________________________________________________________
conv2d_14 (Conv2D) (None, 16, 16, 224) 2458848 add_5[0][0]
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 16, 16, 224) 896 conv2d_14[0][0]
__________________________________________________________________________________________________
activation_12 (Activation) (None, 16, 16, 224) 0 batch_normalization_12[0][0]
__________________________________________________________________________________________________
conv2d_15 (Conv2D) (None, 16, 16, 224) 2458848 activation_12[0][0]
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, 16, 16, 224) 896 conv2d_15[0][0]
__________________________________________________________________________________________________
activation_13 (Activation) (None, 16, 16, 224) 0 batch_normalization_13[0][0]
__________________________________________________________________________________________________
add_6 (Add) (None, 16, 16, 224) 0 add_5[0][0]
activation_13[0][0]
__________________________________________________________________________________________________
conv2d_16 (Conv2D) (None, 16, 16, 224) 2458848 add_6[0][0]
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, 16, 16, 224) 896 conv2d_16[0][0]
__________________________________________________________________________________________________
activation_14 (Activation) (None, 16, 16, 224) 0 batch_normalization_14[0][0]
__________________________________________________________________________________________________
conv2d_17 (Conv2D) (None, 16, 16, 224) 2458848 activation_14[0][0]
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, 16, 16, 224) 896 conv2d_17[0][0]
__________________________________________________________________________________________________
activation_15 (Activation) (None, 16, 16, 224) 0 batch_normalization_15[0][0]
__________________________________________________________________________________________________
add_7 (Add) (None, 16, 16, 224) 0 add_6[0][0]
activation_15[0][0]
__________________________________________________________________________________________________
average_pooling2d_1 (AveragePoo (None, 8, 8, 224) 0 add_7[0][0]
__________________________________________________________________________________________________
conv2d_18 (Conv2D) (None, 8, 8, 448) 100800 average_pooling2d_1[0][0]
__________________________________________________________________________________________________
conv2d_19 (Conv2D) (None, 8, 8, 448) 9834944 conv2d_18[0][0]
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, 8, 8, 448) 1792 conv2d_19[0][0]
__________________________________________________________________________________________________
activation_16 (Activation) (None, 8, 8, 448) 0 batch_normalization_16[0][0]
__________________________________________________________________________________________________
conv2d_20 (Conv2D) (None, 8, 8, 448) 9834944 activation_16[0][0]
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, 8, 8, 448) 1792 conv2d_20[0][0]
__________________________________________________________________________________________________
activation_17 (Activation) (None, 8, 8, 448) 0 batch_normalization_17[0][0]
__________________________________________________________________________________________________
add_8 (Add) (None, 8, 8, 448) 0 conv2d_18[0][0]
activation_17[0][0]
__________________________________________________________________________________________________
conv2d_21 (Conv2D) (None, 8, 8, 448) 9834944 add_8[0][0]
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, 8, 8, 448) 1792 conv2d_21[0][0]
__________________________________________________________________________________________________
activation_18 (Activation) (None, 8, 8, 448) 0 batch_normalization_18[0][0]
__________________________________________________________________________________________________
conv2d_22 (Conv2D) (None, 8, 8, 448) 9834944 activation_18[0][0]
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, 8, 8, 448) 1792 conv2d_22[0][0]
__________________________________________________________________________________________________
activation_19 (Activation) (None, 8, 8, 448) 0 batch_normalization_19[0][0]
__________________________________________________________________________________________________
add_9 (Add) (None, 8, 8, 448) 0 add_8[0][0]
activation_19[0][0]
__________________________________________________________________________________________________
conv2d_23 (Conv2D) (None, 8, 8, 448) 9834944 add_9[0][0]
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, 8, 8, 448) 1792 conv2d_23[0][0]
__________________________________________________________________________________________________
activation_20 (Activation) (None, 8, 8, 448) 0 batch_normalization_20[0][0]
__________________________________________________________________________________________________
conv2d_24 (Conv2D) (None, 8, 8, 448) 9834944 activation_20[0][0]
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 8, 8, 448) 1792 conv2d_24[0][0]
__________________________________________________________________________________________________
activation_21 (Activation) (None, 8, 8, 448) 0 batch_normalization_21[0][0]
__________________________________________________________________________________________________
add_10 (Add) (None, 8, 8, 448) 0 add_9[0][0]
activation_21[0][0]
__________________________________________________________________________________________________
conv2d_25 (Conv2D) (None, 8, 8, 448) 9834944 add_10[0][0]
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 8, 8, 448) 1792 conv2d_25[0][0]
__________________________________________________________________________________________________
activation_22 (Activation) (None, 8, 8, 448) 0 batch_normalization_22[0][0]
__________________________________________________________________________________________________
conv2d_26 (Conv2D) (None, 8, 8, 448) 9834944 activation_22[0][0]
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 8, 8, 448) 1792 conv2d_26[0][0]
__________________________________________________________________________________________________
activation_23 (Activation) (None, 8, 8, 448) 0 batch_normalization_23[0][0]
__________________________________________________________________________________________________
add_11 (Add) (None, 8, 8, 448) 0 add_10[0][0]
activation_23[0][0]
__________________________________________________________________________________________________
global_average_pooling2d (Globa (None, 448) 0 add_11[0][0]
__________________________________________________________________________________________________
dense (Dense) (None, 100) 44900 global_average_pooling2d[0][0]
==================================================================================================
Total params: 103,465,028
Trainable params: 103,452,484
Non-trainable params: 12,544
__________________________________________________________________________________________________
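The per-layer counts in the summary can be reproduced from the standard Keras parameter formulas (Conv2D with bias: k·k·c_in·c_out + c_out; BatchNormalization: 4 per channel, of which the two moving statistics are non-trainable). A small sketch that checks the table's numbers — note the 1x1 kernel size for the shortcut/transition convs is inferred from the counts, not taken from the gist's source:

```python
# Sanity-check the parameter counts reported in model.summary() above.
# Assumed shapes (inferred from the table, not from the gist's source code):
#   - stage-entry convs (conv2d, conv2d_9, conv2d_18) are 1x1
#   - residual convs use the 7x7 kernel named in the title
#   - widths are 112 / 224 / 448 across the three stages (k=7 widening)

def conv_params(k, c_in, c_out):
    """Conv2D with bias: k*k*c_in*c_out weights + c_out biases."""
    return k * k * c_in * c_out + c_out

def bn_params(c):
    """BatchNormalization: gamma, beta, moving_mean, moving_variance."""
    return 4 * c

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

# Per-layer values reported in the summary:
assert conv_params(1, 3, 112) == 448        # conv2d   (1x1 entry from RGB)
assert conv_params(7, 112, 112) == 614768   # conv2d_1 .. conv2d_8
assert conv_params(1, 112, 224) == 25312    # conv2d_9 (1x1 transition)
assert conv_params(7, 224, 224) == 2458848  # conv2d_10 .. conv2d_17
assert conv_params(1, 224, 448) == 100800   # conv2d_18 (1x1 transition)
assert conv_params(7, 448, 448) == 9834944  # conv2d_19 .. conv2d_26
assert bn_params(112) == 448 and bn_params(224) == 896 and bn_params(448) == 1792
assert dense_params(448, 100) == 44900      # dense (100-way head)

# Totals: per stage, 1 entry conv + 8 residual convs (4 blocks x 2) + 8 BNs.
total = 0
for c_in, c in [(3, 112), (112, 224), (224, 448)]:
    total += conv_params(1, c_in, c)   # stage entry (1x1)
    total += 8 * conv_params(7, c, c)  # residual 7x7 convs
    total += 8 * bn_params(c)
total += dense_params(448, 100)
print(total)  # -> 103465028, matching "Total params"

# Non-trainable params are the BN moving statistics (2 of the 4 per channel):
non_trainable = sum(8 * 2 * c for c in (112, 224, 448))
print(non_trainable)  # -> 12544, matching "Non-trainable params"
```

The check confirms the table is internally consistent: every residual conv carries a bias and every BatchNorm contributes half its parameters to the non-trainable total.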