_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) (None, 224, 224, 3) 0
_________________________________________________________________
conv1_pad (ZeroPadding2D) (None, 225, 225, 3) 0
_________________________________________________________________
conv1 (Conv2D) (None, 112, 112, 32) 864
_________________________________________________________________
conv1_bn (BatchNormalization (None, 112, 112, 32) 128
_________________________________________________________________
conv1_relu (ReLU) (None, 112, 112, 32) 0
_________________________________________________________________
conv_dw_1 (DepthwiseConv2D) (None, 112, 112, 32) 288
_________________________________________________________________
conv_dw_1_bn (BatchNormaliza (None, 112, 112, 32) 128
_________________________________________________________________
conv_dw_1_relu (ReLU) (None, 112, 112, 32) 0
_________________________________________________________________
conv_pw_1 (Conv2D) (None, 112, 112, 64) 2048
_________________________________________________________________
conv_pw_1_bn (BatchNormaliza (None, 112, 112, 64) 256
_________________________________________________________________
conv_pw_1_relu (ReLU) (None, 112, 112, 64) 0
_________________________________________________________________
conv_pad_2 (ZeroPadding2D) (None, 113, 113, 64) 0
_________________________________________________________________
conv_dw_2 (DepthwiseConv2D) (None, 56, 56, 64) 576
_________________________________________________________________
conv_dw_2_bn (BatchNormaliza (None, 56, 56, 64) 256
_________________________________________________________________
conv_dw_2_relu (ReLU) (None, 56, 56, 64) 0
_________________________________________________________________
conv_pw_2 (Conv2D) (None, 56, 56, 128) 8192
_________________________________________________________________
conv_pw_2_bn (BatchNormaliza (None, 56, 56, 128) 512
_________________________________________________________________
conv_pw_2_relu (ReLU) (None, 56, 56, 128) 0
_________________________________________________________________
conv_dw_3 (DepthwiseConv2D) (None, 56, 56, 128) 1152
_________________________________________________________________
conv_dw_3_bn (BatchNormaliza (None, 56, 56, 128) 512
_________________________________________________________________
conv_dw_3_relu (ReLU) (None, 56, 56, 128) 0
_________________________________________________________________
conv_pw_3 (Conv2D) (None, 56, 56, 128) 16384
_________________________________________________________________
conv_pw_3_bn (BatchNormaliza (None, 56, 56, 128) 512
_________________________________________________________________
conv_pw_3_relu (ReLU) (None, 56, 56, 128) 0
_________________________________________________________________
conv_pad_4 (ZeroPadding2D) (None, 57, 57, 128) 0
_________________________________________________________________
conv_dw_4 (DepthwiseConv2D) (None, 28, 28, 128) 1152
_________________________________________________________________
conv_dw_4_bn (BatchNormaliza (None, 28, 28, 128) 512
_________________________________________________________________
conv_dw_4_relu (ReLU) (None, 28, 28, 128) 0
_________________________________________________________________
conv_pw_4 (Conv2D) (None, 28, 28, 256) 32768
_________________________________________________________________
conv_pw_4_bn (BatchNormaliza (None, 28, 28, 256) 1024
_________________________________________________________________
conv_pw_4_relu (ReLU) (None, 28, 28, 256) 0
_________________________________________________________________
conv_dw_5 (DepthwiseConv2D) (None, 28, 28, 256) 2304
_________________________________________________________________
conv_dw_5_bn (BatchNormaliza (None, 28, 28, 256) 1024
_________________________________________________________________
conv_dw_5_relu (ReLU) (None, 28, 28, 256) 0
_________________________________________________________________
conv_pw_5 (Conv2D) (None, 28, 28, 256) 65536
_________________________________________________________________
conv_pw_5_bn (BatchNormaliza (None, 28, 28, 256) 1024
_________________________________________________________________
conv_pw_5_relu (ReLU) (None, 28, 28, 256) 0
_________________________________________________________________
conv_pad_6 (ZeroPadding2D) (None, 29, 29, 256) 0
_________________________________________________________________
conv_dw_6 (DepthwiseConv2D) (None, 14, 14, 256) 2304
_________________________________________________________________
conv_dw_6_bn (BatchNormaliza (None, 14, 14, 256) 1024
_________________________________________________________________
conv_dw_6_relu (ReLU) (None, 14, 14, 256) 0
_________________________________________________________________
conv_pw_6 (Conv2D) (None, 14, 14, 512) 131072
_________________________________________________________________
conv_pw_6_bn (BatchNormaliza (None, 14, 14, 512) 2048
_________________________________________________________________
conv_pw_6_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_dw_7 (DepthwiseConv2D) (None, 14, 14, 512) 4608
_________________________________________________________________
conv_dw_7_bn (BatchNormaliza (None, 14, 14, 512) 2048
_________________________________________________________________
conv_dw_7_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_pw_7 (Conv2D) (None, 14, 14, 512) 262144
_________________________________________________________________
conv_pw_7_bn (BatchNormaliza (None, 14, 14, 512) 2048
_________________________________________________________________
conv_pw_7_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_dw_8 (DepthwiseConv2D) (None, 14, 14, 512) 4608
_________________________________________________________________
conv_dw_8_bn (BatchNormaliza (None, 14, 14, 512) 2048
_________________________________________________________________
conv_dw_8_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_pw_8 (Conv2D) (None, 14, 14, 512) 262144
_________________________________________________________________
conv_pw_8_bn (BatchNormaliza (None, 14, 14, 512) 2048
_________________________________________________________________
conv_pw_8_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_dw_9 (DepthwiseConv2D) (None, 14, 14, 512) 4608
_________________________________________________________________
conv_dw_9_bn (BatchNormaliza (None, 14, 14, 512) 2048
_________________________________________________________________
conv_dw_9_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_pw_9 (Conv2D) (None, 14, 14, 512) 262144
_________________________________________________________________
conv_pw_9_bn (BatchNormaliza (None, 14, 14, 512) 2048
_________________________________________________________________
conv_pw_9_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_dw_10 (DepthwiseConv2D) (None, 14, 14, 512) 4608
_________________________________________________________________
conv_dw_10_bn (BatchNormaliz (None, 14, 14, 512) 2048
_________________________________________________________________
conv_dw_10_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_pw_10 (Conv2D) (None, 14, 14, 512) 262144
_________________________________________________________________
conv_pw_10_bn (BatchNormaliz (None, 14, 14, 512) 2048
_________________________________________________________________
conv_pw_10_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_dw_11 (DepthwiseConv2D) (None, 14, 14, 512) 4608
_________________________________________________________________
conv_dw_11_bn (BatchNormaliz (None, 14, 14, 512) 2048
_________________________________________________________________
conv_dw_11_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_pw_11 (Conv2D) (None, 14, 14, 512) 262144
_________________________________________________________________
conv_pw_11_bn (BatchNormaliz (None, 14, 14, 512) 2048
_________________________________________________________________
conv_pw_11_relu (ReLU) (None, 14, 14, 512) 0
_________________________________________________________________
conv_pad_12 (ZeroPadding2D) (None, 15, 15, 512) 0
_________________________________________________________________
conv_dw_12 (DepthwiseConv2D) (None, 7, 7, 512) 4608
_________________________________________________________________
conv_dw_12_bn (BatchNormaliz (None, 7, 7, 512) 2048
_________________________________________________________________
conv_dw_12_relu (ReLU) (None, 7, 7, 512) 0
_________________________________________________________________
conv_pw_12 (Conv2D) (None, 7, 7, 1024) 524288
_________________________________________________________________
conv_pw_12_bn (BatchNormaliz (None, 7, 7, 1024) 4096
_________________________________________________________________
conv_pw_12_relu (ReLU) (None, 7, 7, 1024) 0
_________________________________________________________________
conv_dw_13 (DepthwiseConv2D) (None, 7, 7, 1024) 9216
_________________________________________________________________
conv_dw_13_bn (BatchNormaliz (None, 7, 7, 1024) 4096
_________________________________________________________________
conv_dw_13_relu (ReLU) (None, 7, 7, 1024) 0
_________________________________________________________________
conv_pw_13 (Conv2D) (None, 7, 7, 1024) 1048576
_________________________________________________________________
conv_pw_13_bn (BatchNormaliz (None, 7, 7, 1024) 4096
_________________________________________________________________
conv_pw_13_relu (ReLU) (None, 7, 7, 1024) 0
_________________________________________________________________
global_average_pooling2d_1 ( (None, 1024) 0
_________________________________________________________________
dense_1 (Dense) (None, 512) 524800
_________________________________________________________________
dense_2 (Dense) (None, 6) 3078
=================================================================
Total params: 3,756,742
Trainable params: 3,734,854
Non-trainable params: 21,888
_________________________________________________________________
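The summary above matches a Keras MobileNet (alpha=1.0, 224x224x3 input) without its ImageNet top, followed by a GlobalAveragePooling2D layer, a Dense(512) layer, and a Dense(6) output. A minimal sketch that reproduces the same layer stack and parameter counts is given below; the weight initialization, activations, and frozen/trainable settings are assumptions, since the gist only records the printed summary.

from keras.applications.mobilenet import MobileNet
from keras.layers import GlobalAveragePooling2D, Dense
from keras.models import Model

# Sketch only: reconstructs the layer stack implied by the summary above.
# Weights and activations are assumptions, not shown in the gist.
base = MobileNet(input_shape=(224, 224, 3), include_top=False, weights='imagenet')

x = GlobalAveragePooling2D()(base.output)   # (None, 7, 7, 1024) -> (None, 1024)
x = Dense(512, activation='relu')(x)        # 1024*512 + 512 = 524,800 params
out = Dense(6, activation='softmax')(x)     # 512*6   + 6   =   3,078 params

model = Model(inputs=base.input, outputs=out)
model.summary()                             # prints the table above
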
Train on 1620 samples, validate on 180 samples
Epoch 1/20
1620/1620 [==============================] - 2212s 1s/step - loss: 0.3284 - acc: 0.9154 - val_loss: 0.0902 - val_acc: 0.9833
Epoch 2/20
1620/1620 [==============================] - 2174s 1s/step - loss: 0.1294 - acc: 0.9728 - val_loss: 0.3315 - val_acc: 0.9444
Epoch 3/20
1620/1620 [==============================] - 2207s 1s/step - loss: 0.0638 - acc: 0.9858 - val_loss: 0.0677 - val_acc: 0.9833
Epoch 4/20
1620/1620 [==============================] - 2344s 1s/step - loss: 0.0150 - acc: 0.9957 - val_loss: 0.0578 - val_acc: 0.9944
Epoch 5/20
1620/1620 [==============================] - 2226s 1s/step - loss: 0.1021 - acc: 0.9827 - val_loss: 1.9420 - val_acc: 0.8222
Epoch 6/20
1620/1620 [==============================] - 2287s 1s/step - loss: 0.1077 - acc: 0.9858 - val_loss: 3.6149 - val_acc: 0.7111
Epoch 7/20
1620/1620 [==============================] - 2297s 1s/step - loss: 0.1274 - acc: 0.9765 - val_loss: 4.5758 - val_acc: 0.5222
Epoch 8/20
1620/1620 [==============================] - 2331s 1s/step - loss: 0.0570 - acc: 0.9883 - val_loss: 1.0351 - val_acc: 0.8833
Epoch 9/20
1620/1620 [==============================] - 2291s 1s/step - loss: 0.1577 - acc: 0.9648 - val_loss: 6.1326 - val_acc: 0.5056
Epoch 10/20
1620/1620 [==============================] - 2244s 1s/step - loss: 0.0935 - acc: 0.9784 - val_loss: 6.2652e-05 - val_acc: 1.0000
Epoch 11/20
1620/1620 [==============================] - 2159s 1s/step - loss: 0.0407 - acc: 0.9883 - val_loss: 0.1965 - val_acc: 0.9722
Epoch 12/20
1620/1620 [==============================] - 2161s 1s/step - loss: 0.0400 - acc: 0.9938 - val_loss: 0.1242 - val_acc: 0.9667
Epoch 13/20
1620/1620 [==============================] - 2140s 1s/step - loss: 0.0045 - acc: 0.9988 - val_loss: 0.1423 - val_acc: 0.9667
Epoch 14/20
1620/1620 [==============================] - 2145s 1s/step - loss: 0.0459 - acc: 0.9895 - val_loss: 0.0011 - val_acc: 1.0000
Epoch 15/20
1620/1620 [==============================] - 2374s 1s/step - loss: 0.0462 - acc: 0.9920 - val_loss: 0.5006 - val_acc: 0.9500
Epoch 16/20
1620/1620 [==============================] - 2300s 1s/step - loss: 0.0083 - acc: 0.9981 - val_loss: 0.0740 - val_acc: 0.9889
Epoch 17/20
1620/1620 [==============================] - 2312s 1s/step - loss: 0.0237 - acc: 0.9944 - val_loss: 2.5867 - val_acc: 0.7444
Epoch 18/20
1620/1620 [==============================] - 2516s 2s/step - loss: 0.0487 - acc: 0.9883 - val_loss: 0.6438 - val_acc: 0.8889
Epoch 19/20
1620/1620 [==============================] - 2691s 2s/step - loss: 0.0108 - acc: 0.9969 - val_loss: 0.4914 - val_acc: 0.9111
Epoch 20/20
1620/1620 [==============================] - 2366s 1s/step - loss: 0.0052 - acc: 0.9988 - val_loss: 0.3683 - val_acc: 0.9500
>>> runtime = 45781.69272732735 seconds
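The log corresponds to a 20-epoch fit over 1620 training and 180 validation samples, i.e. a 90/10 split of 1800 labelled images across 6 classes. A hedged sketch of a compile/fit call that would produce this kind of log follows; the optimizer, loss, batch size, and the x_data/y_data arrays are assumptions not recorded in the gist.

import time

# Sketch only: optimizer, batch size, and data arrays are hypothetical;
# the gist records just the resulting per-epoch log and total runtime.
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

start = time.time()
history = model.fit(x_data, y_data,          # 1800 images, 6 one-hot classes (hypothetical arrays)
                    validation_split=0.1,    # 1620 train / 180 validation samples, as in the log
                    epochs=20,
                    batch_size=32)           # batch size not shown in the gist
print('>>> runtime =', time.time() - start, 'seconds')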