
@naruarjun
Created August 24, 2018 18:33
  • Save naruarjun/8cf4f4924d28c0e1b65ad85e0fe08095 to your computer and use it in GitHub Desktop.
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Pretrained Classification Model\n",
"This is a pretrained classification model to classify the natural images dataset. The pretrained network used is InceptionV3 and the deep learning framework used is keras. This model achieved a validation accuracy of about 98.32%. The Model is basically the InceptionV3 architecture with two dense layers on top of it. The output of the following cell contains the entire architecture of the network summarised. I manually divided the data into two folders. Natural_images contained my training data and natural_images_validation contains my validation data"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model loaded.\n",
"InceptionV3 Model Summary:\n",
"__________________________________________________________________________________________________\n",
"Layer (type) Output Shape Param # Connected to \n",
"==================================================================================================\n",
"input_2 (InputLayer) (None, 139, 139, 3) 0 \n",
"__________________________________________________________________________________________________\n",
"conv2d_95 (Conv2D) (None, 69, 69, 32) 864 input_2[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_95 (BatchNo (None, 69, 69, 32) 96 conv2d_95[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_95 (Activation) (None, 69, 69, 32) 0 batch_normalization_95[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_96 (Conv2D) (None, 67, 67, 32) 9216 activation_95[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_96 (BatchNo (None, 67, 67, 32) 96 conv2d_96[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_96 (Activation) (None, 67, 67, 32) 0 batch_normalization_96[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_97 (Conv2D) (None, 67, 67, 64) 18432 activation_96[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_97 (BatchNo (None, 67, 67, 64) 192 conv2d_97[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_97 (Activation) (None, 67, 67, 64) 0 batch_normalization_97[0][0] \n",
"__________________________________________________________________________________________________\n",
"max_pooling2d_5 (MaxPooling2D) (None, 33, 33, 64) 0 activation_97[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_98 (Conv2D) (None, 33, 33, 80) 5120 max_pooling2d_5[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_98 (BatchNo (None, 33, 33, 80) 240 conv2d_98[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_98 (Activation) (None, 33, 33, 80) 0 batch_normalization_98[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_99 (Conv2D) (None, 31, 31, 192) 138240 activation_98[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_99 (BatchNo (None, 31, 31, 192) 576 conv2d_99[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_99 (Activation) (None, 31, 31, 192) 0 batch_normalization_99[0][0] \n",
"__________________________________________________________________________________________________\n",
"max_pooling2d_6 (MaxPooling2D) (None, 15, 15, 192) 0 activation_99[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_103 (Conv2D) (None, 15, 15, 64) 12288 max_pooling2d_6[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_103 (BatchN (None, 15, 15, 64) 192 conv2d_103[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_103 (Activation) (None, 15, 15, 64) 0 batch_normalization_103[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_101 (Conv2D) (None, 15, 15, 48) 9216 max_pooling2d_6[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_104 (Conv2D) (None, 15, 15, 96) 55296 activation_103[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_101 (BatchN (None, 15, 15, 48) 144 conv2d_101[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_104 (BatchN (None, 15, 15, 96) 288 conv2d_104[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_101 (Activation) (None, 15, 15, 48) 0 batch_normalization_101[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_104 (Activation) (None, 15, 15, 96) 0 batch_normalization_104[0][0] \n",
"__________________________________________________________________________________________________\n",
"average_pooling2d_10 (AveragePo (None, 15, 15, 192) 0 max_pooling2d_6[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_100 (Conv2D) (None, 15, 15, 64) 12288 max_pooling2d_6[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_102 (Conv2D) (None, 15, 15, 64) 76800 activation_101[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_105 (Conv2D) (None, 15, 15, 96) 82944 activation_104[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_106 (Conv2D) (None, 15, 15, 32) 6144 average_pooling2d_10[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_100 (BatchN (None, 15, 15, 64) 192 conv2d_100[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_102 (BatchN (None, 15, 15, 64) 192 conv2d_102[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_105 (BatchN (None, 15, 15, 96) 288 conv2d_105[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_106 (BatchN (None, 15, 15, 32) 96 conv2d_106[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_100 (Activation) (None, 15, 15, 64) 0 batch_normalization_100[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_102 (Activation) (None, 15, 15, 64) 0 batch_normalization_102[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_105 (Activation) (None, 15, 15, 96) 0 batch_normalization_105[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_106 (Activation) (None, 15, 15, 32) 0 batch_normalization_106[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed0 (Concatenate) (None, 15, 15, 256) 0 activation_100[0][0] \n",
" activation_102[0][0] \n",
" activation_105[0][0] \n",
" activation_106[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_110 (Conv2D) (None, 15, 15, 64) 16384 mixed0[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_110 (BatchN (None, 15, 15, 64) 192 conv2d_110[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_110 (Activation) (None, 15, 15, 64) 0 batch_normalization_110[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_108 (Conv2D) (None, 15, 15, 48) 12288 mixed0[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_111 (Conv2D) (None, 15, 15, 96) 55296 activation_110[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_108 (BatchN (None, 15, 15, 48) 144 conv2d_108[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_111 (BatchN (None, 15, 15, 96) 288 conv2d_111[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_108 (Activation) (None, 15, 15, 48) 0 batch_normalization_108[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_111 (Activation) (None, 15, 15, 96) 0 batch_normalization_111[0][0] \n",
"__________________________________________________________________________________________________\n",
"average_pooling2d_11 (AveragePo (None, 15, 15, 256) 0 mixed0[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_107 (Conv2D) (None, 15, 15, 64) 16384 mixed0[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_109 (Conv2D) (None, 15, 15, 64) 76800 activation_108[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_112 (Conv2D) (None, 15, 15, 96) 82944 activation_111[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_113 (Conv2D) (None, 15, 15, 64) 16384 average_pooling2d_11[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_107 (BatchN (None, 15, 15, 64) 192 conv2d_107[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_109 (BatchN (None, 15, 15, 64) 192 conv2d_109[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_112 (BatchN (None, 15, 15, 96) 288 conv2d_112[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_113 (BatchN (None, 15, 15, 64) 192 conv2d_113[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_107 (Activation) (None, 15, 15, 64) 0 batch_normalization_107[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_109 (Activation) (None, 15, 15, 64) 0 batch_normalization_109[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_112 (Activation) (None, 15, 15, 96) 0 batch_normalization_112[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_113 (Activation) (None, 15, 15, 64) 0 batch_normalization_113[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed1 (Concatenate) (None, 15, 15, 288) 0 activation_107[0][0] \n",
" activation_109[0][0] \n",
" activation_112[0][0] \n",
" activation_113[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_117 (Conv2D) (None, 15, 15, 64) 18432 mixed1[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_117 (BatchN (None, 15, 15, 64) 192 conv2d_117[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_117 (Activation) (None, 15, 15, 64) 0 batch_normalization_117[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_115 (Conv2D) (None, 15, 15, 48) 13824 mixed1[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_118 (Conv2D) (None, 15, 15, 96) 55296 activation_117[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_115 (BatchN (None, 15, 15, 48) 144 conv2d_115[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_118 (BatchN (None, 15, 15, 96) 288 conv2d_118[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_115 (Activation) (None, 15, 15, 48) 0 batch_normalization_115[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_118 (Activation) (None, 15, 15, 96) 0 batch_normalization_118[0][0] \n",
"__________________________________________________________________________________________________\n",
"average_pooling2d_12 (AveragePo (None, 15, 15, 288) 0 mixed1[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_114 (Conv2D) (None, 15, 15, 64) 18432 mixed1[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_116 (Conv2D) (None, 15, 15, 64) 76800 activation_115[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_119 (Conv2D) (None, 15, 15, 96) 82944 activation_118[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_120 (Conv2D) (None, 15, 15, 64) 18432 average_pooling2d_12[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_114 (BatchN (None, 15, 15, 64) 192 conv2d_114[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_116 (BatchN (None, 15, 15, 64) 192 conv2d_116[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_119 (BatchN (None, 15, 15, 96) 288 conv2d_119[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_120 (BatchN (None, 15, 15, 64) 192 conv2d_120[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_114 (Activation) (None, 15, 15, 64) 0 batch_normalization_114[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_116 (Activation) (None, 15, 15, 64) 0 batch_normalization_116[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_119 (Activation) (None, 15, 15, 96) 0 batch_normalization_119[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_120 (Activation) (None, 15, 15, 64) 0 batch_normalization_120[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed2 (Concatenate) (None, 15, 15, 288) 0 activation_114[0][0] \n",
" activation_116[0][0] \n",
" activation_119[0][0] \n",
" activation_120[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_122 (Conv2D) (None, 15, 15, 64) 18432 mixed2[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_122 (BatchN (None, 15, 15, 64) 192 conv2d_122[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_122 (Activation) (None, 15, 15, 64) 0 batch_normalization_122[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_123 (Conv2D) (None, 15, 15, 96) 55296 activation_122[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_123 (BatchN (None, 15, 15, 96) 288 conv2d_123[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_123 (Activation) (None, 15, 15, 96) 0 batch_normalization_123[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_121 (Conv2D) (None, 7, 7, 384) 995328 mixed2[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_124 (Conv2D) (None, 7, 7, 96) 82944 activation_123[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_121 (BatchN (None, 7, 7, 384) 1152 conv2d_121[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_124 (BatchN (None, 7, 7, 96) 288 conv2d_124[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_121 (Activation) (None, 7, 7, 384) 0 batch_normalization_121[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_124 (Activation) (None, 7, 7, 96) 0 batch_normalization_124[0][0] \n",
"__________________________________________________________________________________________________\n",
"max_pooling2d_7 (MaxPooling2D) (None, 7, 7, 288) 0 mixed2[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed3 (Concatenate) (None, 7, 7, 768) 0 activation_121[0][0] \n",
" activation_124[0][0] \n",
" max_pooling2d_7[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_129 (Conv2D) (None, 7, 7, 128) 98304 mixed3[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_129 (BatchN (None, 7, 7, 128) 384 conv2d_129[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_129 (Activation) (None, 7, 7, 128) 0 batch_normalization_129[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_130 (Conv2D) (None, 7, 7, 128) 114688 activation_129[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_130 (BatchN (None, 7, 7, 128) 384 conv2d_130[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_130 (Activation) (None, 7, 7, 128) 0 batch_normalization_130[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_126 (Conv2D) (None, 7, 7, 128) 98304 mixed3[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_131 (Conv2D) (None, 7, 7, 128) 114688 activation_130[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_126 (BatchN (None, 7, 7, 128) 384 conv2d_126[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_131 (BatchN (None, 7, 7, 128) 384 conv2d_131[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_126 (Activation) (None, 7, 7, 128) 0 batch_normalization_126[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_131 (Activation) (None, 7, 7, 128) 0 batch_normalization_131[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_127 (Conv2D) (None, 7, 7, 128) 114688 activation_126[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_132 (Conv2D) (None, 7, 7, 128) 114688 activation_131[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_127 (BatchN (None, 7, 7, 128) 384 conv2d_127[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_132 (BatchN (None, 7, 7, 128) 384 conv2d_132[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_127 (Activation) (None, 7, 7, 128) 0 batch_normalization_127[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_132 (Activation) (None, 7, 7, 128) 0 batch_normalization_132[0][0] \n",
"__________________________________________________________________________________________________\n",
"average_pooling2d_13 (AveragePo (None, 7, 7, 768) 0 mixed3[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_125 (Conv2D) (None, 7, 7, 192) 147456 mixed3[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_128 (Conv2D) (None, 7, 7, 192) 172032 activation_127[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_133 (Conv2D) (None, 7, 7, 192) 172032 activation_132[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_134 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_13[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_125 (BatchN (None, 7, 7, 192) 576 conv2d_125[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_128 (BatchN (None, 7, 7, 192) 576 conv2d_128[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_133 (BatchN (None, 7, 7, 192) 576 conv2d_133[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_134 (BatchN (None, 7, 7, 192) 576 conv2d_134[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_125 (Activation) (None, 7, 7, 192) 0 batch_normalization_125[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_128 (Activation) (None, 7, 7, 192) 0 batch_normalization_128[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_133 (Activation) (None, 7, 7, 192) 0 batch_normalization_133[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_134 (Activation) (None, 7, 7, 192) 0 batch_normalization_134[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed4 (Concatenate) (None, 7, 7, 768) 0 activation_125[0][0] \n",
" activation_128[0][0] \n",
" activation_133[0][0] \n",
" activation_134[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_139 (Conv2D) (None, 7, 7, 160) 122880 mixed4[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_139 (BatchN (None, 7, 7, 160) 480 conv2d_139[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_139 (Activation) (None, 7, 7, 160) 0 batch_normalization_139[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_140 (Conv2D) (None, 7, 7, 160) 179200 activation_139[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_140 (BatchN (None, 7, 7, 160) 480 conv2d_140[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_140 (Activation) (None, 7, 7, 160) 0 batch_normalization_140[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_136 (Conv2D) (None, 7, 7, 160) 122880 mixed4[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_141 (Conv2D) (None, 7, 7, 160) 179200 activation_140[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_136 (BatchN (None, 7, 7, 160) 480 conv2d_136[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_141 (BatchN (None, 7, 7, 160) 480 conv2d_141[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_136 (Activation) (None, 7, 7, 160) 0 batch_normalization_136[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_141 (Activation) (None, 7, 7, 160) 0 batch_normalization_141[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_137 (Conv2D) (None, 7, 7, 160) 179200 activation_136[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_142 (Conv2D) (None, 7, 7, 160) 179200 activation_141[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_137 (BatchN (None, 7, 7, 160) 480 conv2d_137[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_142 (BatchN (None, 7, 7, 160) 480 conv2d_142[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_137 (Activation) (None, 7, 7, 160) 0 batch_normalization_137[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_142 (Activation) (None, 7, 7, 160) 0 batch_normalization_142[0][0] \n",
"__________________________________________________________________________________________________\n",
"average_pooling2d_14 (AveragePo (None, 7, 7, 768) 0 mixed4[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_135 (Conv2D) (None, 7, 7, 192) 147456 mixed4[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_138 (Conv2D) (None, 7, 7, 192) 215040 activation_137[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_143 (Conv2D) (None, 7, 7, 192) 215040 activation_142[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_144 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_14[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_135 (BatchN (None, 7, 7, 192) 576 conv2d_135[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_138 (BatchN (None, 7, 7, 192) 576 conv2d_138[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_143 (BatchN (None, 7, 7, 192) 576 conv2d_143[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_144 (BatchN (None, 7, 7, 192) 576 conv2d_144[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_135 (Activation) (None, 7, 7, 192) 0 batch_normalization_135[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_138 (Activation) (None, 7, 7, 192) 0 batch_normalization_138[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_143 (Activation) (None, 7, 7, 192) 0 batch_normalization_143[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_144 (Activation) (None, 7, 7, 192) 0 batch_normalization_144[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed5 (Concatenate) (None, 7, 7, 768) 0 activation_135[0][0] \n",
" activation_138[0][0] \n",
" activation_143[0][0] \n",
" activation_144[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_149 (Conv2D) (None, 7, 7, 160) 122880 mixed5[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_149 (BatchN (None, 7, 7, 160) 480 conv2d_149[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_149 (Activation) (None, 7, 7, 160) 0 batch_normalization_149[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_150 (Conv2D) (None, 7, 7, 160) 179200 activation_149[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_150 (BatchN (None, 7, 7, 160) 480 conv2d_150[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_150 (Activation) (None, 7, 7, 160) 0 batch_normalization_150[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_146 (Conv2D) (None, 7, 7, 160) 122880 mixed5[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_151 (Conv2D) (None, 7, 7, 160) 179200 activation_150[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_146 (BatchN (None, 7, 7, 160) 480 conv2d_146[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_151 (BatchN (None, 7, 7, 160) 480 conv2d_151[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_146 (Activation) (None, 7, 7, 160) 0 batch_normalization_146[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_151 (Activation) (None, 7, 7, 160) 0 batch_normalization_151[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_147 (Conv2D) (None, 7, 7, 160) 179200 activation_146[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_152 (Conv2D) (None, 7, 7, 160) 179200 activation_151[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_147 (BatchN (None, 7, 7, 160) 480 conv2d_147[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_152 (BatchN (None, 7, 7, 160) 480 conv2d_152[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_147 (Activation) (None, 7, 7, 160) 0 batch_normalization_147[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_152 (Activation) (None, 7, 7, 160) 0 batch_normalization_152[0][0] \n",
"__________________________________________________________________________________________________\n",
"average_pooling2d_15 (AveragePo (None, 7, 7, 768) 0 mixed5[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_145 (Conv2D) (None, 7, 7, 192) 147456 mixed5[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_148 (Conv2D) (None, 7, 7, 192) 215040 activation_147[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_153 (Conv2D) (None, 7, 7, 192) 215040 activation_152[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_154 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_15[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_145 (BatchN (None, 7, 7, 192) 576 conv2d_145[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_148 (BatchN (None, 7, 7, 192) 576 conv2d_148[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_153 (BatchN (None, 7, 7, 192) 576 conv2d_153[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_154 (BatchN (None, 7, 7, 192) 576 conv2d_154[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_145 (Activation) (None, 7, 7, 192) 0 batch_normalization_145[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_148 (Activation) (None, 7, 7, 192) 0 batch_normalization_148[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_153 (Activation) (None, 7, 7, 192) 0 batch_normalization_153[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_154 (Activation) (None, 7, 7, 192) 0 batch_normalization_154[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed6 (Concatenate) (None, 7, 7, 768) 0 activation_145[0][0] \n",
" activation_148[0][0] \n",
" activation_153[0][0] \n",
" activation_154[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_159 (Conv2D) (None, 7, 7, 192) 147456 mixed6[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_159 (BatchN (None, 7, 7, 192) 576 conv2d_159[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_159 (Activation) (None, 7, 7, 192) 0 batch_normalization_159[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_160 (Conv2D) (None, 7, 7, 192) 258048 activation_159[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_160 (BatchN (None, 7, 7, 192) 576 conv2d_160[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_160 (Activation) (None, 7, 7, 192) 0 batch_normalization_160[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_156 (Conv2D) (None, 7, 7, 192) 147456 mixed6[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_161 (Conv2D) (None, 7, 7, 192) 258048 activation_160[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_156 (BatchN (None, 7, 7, 192) 576 conv2d_156[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_161 (BatchN (None, 7, 7, 192) 576 conv2d_161[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_156 (Activation) (None, 7, 7, 192) 0 batch_normalization_156[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_161 (Activation) (None, 7, 7, 192) 0 batch_normalization_161[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_157 (Conv2D) (None, 7, 7, 192) 258048 activation_156[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_162 (Conv2D) (None, 7, 7, 192) 258048 activation_161[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_157 (BatchN (None, 7, 7, 192) 576 conv2d_157[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_162 (BatchN (None, 7, 7, 192) 576 conv2d_162[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_157 (Activation) (None, 7, 7, 192) 0 batch_normalization_157[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_162 (Activation) (None, 7, 7, 192) 0 batch_normalization_162[0][0] \n",
"__________________________________________________________________________________________________\n",
"average_pooling2d_16 (AveragePo (None, 7, 7, 768) 0 mixed6[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_155 (Conv2D) (None, 7, 7, 192) 147456 mixed6[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_158 (Conv2D) (None, 7, 7, 192) 258048 activation_157[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_163 (Conv2D) (None, 7, 7, 192) 258048 activation_162[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_164 (Conv2D) (None, 7, 7, 192) 147456 average_pooling2d_16[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_155 (BatchN (None, 7, 7, 192) 576 conv2d_155[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_158 (BatchN (None, 7, 7, 192) 576 conv2d_158[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_163 (BatchN (None, 7, 7, 192) 576 conv2d_163[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_164 (BatchN (None, 7, 7, 192) 576 conv2d_164[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_155 (Activation) (None, 7, 7, 192) 0 batch_normalization_155[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_158 (Activation) (None, 7, 7, 192) 0 batch_normalization_158[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_163 (Activation) (None, 7, 7, 192) 0 batch_normalization_163[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_164 (Activation) (None, 7, 7, 192) 0 batch_normalization_164[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed7 (Concatenate) (None, 7, 7, 768) 0 activation_155[0][0] \n",
" activation_158[0][0] \n",
" activation_163[0][0] \n",
" activation_164[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_167 (Conv2D) (None, 7, 7, 192) 147456 mixed7[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_167 (BatchN (None, 7, 7, 192) 576 conv2d_167[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_167 (Activation) (None, 7, 7, 192) 0 batch_normalization_167[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_168 (Conv2D) (None, 7, 7, 192) 258048 activation_167[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_168 (BatchN (None, 7, 7, 192) 576 conv2d_168[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_168 (Activation) (None, 7, 7, 192) 0 batch_normalization_168[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_165 (Conv2D) (None, 7, 7, 192) 147456 mixed7[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_169 (Conv2D) (None, 7, 7, 192) 258048 activation_168[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_165 (BatchN (None, 7, 7, 192) 576 conv2d_165[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_169 (BatchN (None, 7, 7, 192) 576 conv2d_169[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_165 (Activation) (None, 7, 7, 192) 0 batch_normalization_165[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_169 (Activation) (None, 7, 7, 192) 0 batch_normalization_169[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_166 (Conv2D) (None, 3, 3, 320) 552960 activation_165[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_170 (Conv2D) (None, 3, 3, 192) 331776 activation_169[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_166 (BatchN (None, 3, 3, 320) 960 conv2d_166[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_170 (BatchN (None, 3, 3, 192) 576 conv2d_170[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_166 (Activation) (None, 3, 3, 320) 0 batch_normalization_166[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_170 (Activation) (None, 3, 3, 192) 0 batch_normalization_170[0][0] \n",
"__________________________________________________________________________________________________\n",
"max_pooling2d_8 (MaxPooling2D) (None, 3, 3, 768) 0 mixed7[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed8 (Concatenate) (None, 3, 3, 1280) 0 activation_166[0][0] \n",
" activation_170[0][0] \n",
" max_pooling2d_8[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_175 (Conv2D) (None, 3, 3, 448) 573440 mixed8[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_175 (BatchN (None, 3, 3, 448) 1344 conv2d_175[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_175 (Activation) (None, 3, 3, 448) 0 batch_normalization_175[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_172 (Conv2D) (None, 3, 3, 384) 491520 mixed8[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_176 (Conv2D) (None, 3, 3, 384) 1548288 activation_175[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_172 (BatchN (None, 3, 3, 384) 1152 conv2d_172[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_176 (BatchN (None, 3, 3, 384) 1152 conv2d_176[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_172 (Activation) (None, 3, 3, 384) 0 batch_normalization_172[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_176 (Activation) (None, 3, 3, 384) 0 batch_normalization_176[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_173 (Conv2D) (None, 3, 3, 384) 442368 activation_172[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_174 (Conv2D) (None, 3, 3, 384) 442368 activation_172[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_177 (Conv2D) (None, 3, 3, 384) 442368 activation_176[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_178 (Conv2D) (None, 3, 3, 384) 442368 activation_176[0][0] \n",
"__________________________________________________________________________________________________\n",
"average_pooling2d_17 (AveragePo (None, 3, 3, 1280) 0 mixed8[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_171 (Conv2D) (None, 3, 3, 320) 409600 mixed8[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_173 (BatchN (None, 3, 3, 384) 1152 conv2d_173[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_174 (BatchN (None, 3, 3, 384) 1152 conv2d_174[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_177 (BatchN (None, 3, 3, 384) 1152 conv2d_177[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_178 (BatchN (None, 3, 3, 384) 1152 conv2d_178[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_179 (Conv2D) (None, 3, 3, 192) 245760 average_pooling2d_17[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_171 (BatchN (None, 3, 3, 320) 960 conv2d_171[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_173 (Activation) (None, 3, 3, 384) 0 batch_normalization_173[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_174 (Activation) (None, 3, 3, 384) 0 batch_normalization_174[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_177 (Activation) (None, 3, 3, 384) 0 batch_normalization_177[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_178 (Activation) (None, 3, 3, 384) 0 batch_normalization_178[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_179 (BatchN (None, 3, 3, 192) 576 conv2d_179[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_171 (Activation) (None, 3, 3, 320) 0 batch_normalization_171[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed9_0 (Concatenate) (None, 3, 3, 768) 0 activation_173[0][0] \n",
" activation_174[0][0] \n",
"__________________________________________________________________________________________________\n",
"concatenate_3 (Concatenate) (None, 3, 3, 768) 0 activation_177[0][0] \n",
" activation_178[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_179 (Activation) (None, 3, 3, 192) 0 batch_normalization_179[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed9 (Concatenate) (None, 3, 3, 2048) 0 activation_171[0][0] \n",
" mixed9_0[0][0] \n",
" concatenate_3[0][0] \n",
" activation_179[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_184 (Conv2D) (None, 3, 3, 448) 917504 mixed9[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_184 (BatchN (None, 3, 3, 448) 1344 conv2d_184[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_184 (Activation) (None, 3, 3, 448) 0 batch_normalization_184[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_181 (Conv2D) (None, 3, 3, 384) 786432 mixed9[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_185 (Conv2D) (None, 3, 3, 384) 1548288 activation_184[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_181 (BatchN (None, 3, 3, 384) 1152 conv2d_181[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_185 (BatchN (None, 3, 3, 384) 1152 conv2d_185[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_181 (Activation) (None, 3, 3, 384) 0 batch_normalization_181[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_185 (Activation) (None, 3, 3, 384) 0 batch_normalization_185[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_182 (Conv2D) (None, 3, 3, 384) 442368 activation_181[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_183 (Conv2D) (None, 3, 3, 384) 442368 activation_181[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_186 (Conv2D) (None, 3, 3, 384) 442368 activation_185[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_187 (Conv2D) (None, 3, 3, 384) 442368 activation_185[0][0] \n",
"__________________________________________________________________________________________________\n",
"average_pooling2d_18 (AveragePo (None, 3, 3, 2048) 0 mixed9[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_180 (Conv2D) (None, 3, 3, 320) 655360 mixed9[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_182 (BatchN (None, 3, 3, 384) 1152 conv2d_182[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_183 (BatchN (None, 3, 3, 384) 1152 conv2d_183[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_186 (BatchN (None, 3, 3, 384) 1152 conv2d_186[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_187 (BatchN (None, 3, 3, 384) 1152 conv2d_187[0][0] \n",
"__________________________________________________________________________________________________\n",
"conv2d_188 (Conv2D) (None, 3, 3, 192) 393216 average_pooling2d_18[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_180 (BatchN (None, 3, 3, 320) 960 conv2d_180[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_182 (Activation) (None, 3, 3, 384) 0 batch_normalization_182[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_183 (Activation) (None, 3, 3, 384) 0 batch_normalization_183[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_186 (Activation) (None, 3, 3, 384) 0 batch_normalization_186[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_187 (Activation) (None, 3, 3, 384) 0 batch_normalization_187[0][0] \n",
"__________________________________________________________________________________________________\n",
"batch_normalization_188 (BatchN (None, 3, 3, 192) 576 conv2d_188[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_180 (Activation) (None, 3, 3, 320) 0 batch_normalization_180[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed9_1 (Concatenate) (None, 3, 3, 768) 0 activation_182[0][0] \n",
" activation_183[0][0] \n",
"__________________________________________________________________________________________________\n",
"concatenate_4 (Concatenate) (None, 3, 3, 768) 0 activation_186[0][0] \n",
" activation_187[0][0] \n",
"__________________________________________________________________________________________________\n",
"activation_188 (Activation) (None, 3, 3, 192) 0 batch_normalization_188[0][0] \n",
"__________________________________________________________________________________________________\n",
"mixed10 (Concatenate) (None, 3, 3, 2048) 0 activation_180[0][0] \n",
" mixed9_1[0][0] \n",
" concatenate_4[0][0] \n",
" activation_188[0][0] \n",
"==================================================================================================\n",
"Total params: 21,802,784\n",
"Trainable params: 0\n",
"Non-trainable params: 21,802,784\n",
"__________________________________________________________________________________________________\n",
"None\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Final Model Summary:\n",
"_________________________________________________________________\n",
"Layer (type) Output Shape Param # \n",
"=================================================================\n",
"inception_v3 (Model) (None, 3, 3, 2048) 21802784 \n",
"_________________________________________________________________\n",
"sequential_4 (Sequential) (None, 8) 4720904 \n",
"=================================================================\n",
"Total params: 26,523,688\n",
"Trainable params: 4,720,904\n",
"Non-trainable params: 21,802,784\n",
"_________________________________________________________________\n",
"None\n",
"Found 4754 images belonging to 8 classes.\n",
"Found 2145 images belonging to 8 classes.\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"/usr/local/lib/python3.5/dist-packages/ipykernel_launcher.py:82: UserWarning: The semantics of the Keras 2 argument `steps_per_epoch` is not the same as the Keras 1 argument `samples_per_epoch`. `steps_per_epoch` is the number of batches to draw from the generator at each epoch. Basically steps_per_epoch = samples_per_epoch/batch_size. Similarly `nb_val_samples`->`validation_steps` and `val_samples`->`steps` arguments have changed. Update your method calls accordingly.\n",
"/usr/local/lib/python3.5/dist-packages/ipykernel_launcher.py:82: UserWarning: Update your `fit_generator` call to the Keras 2 API: `fit_generator(<keras_pre..., validation_data=<keras_pre..., shuffle=True, validation_steps=2145, steps_per_epoch=297, epochs=50)`\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/50\n",
"297/297 [==============================] - 48s 163ms/step - loss: 1.2736 - acc: 0.5804 - val_loss: 0.2559 - val_acc: 0.9184\n",
"Epoch 2/50\n",
"297/297 [==============================] - 43s 144ms/step - loss: 0.6782 - acc: 0.8110 - val_loss: 0.1568 - val_acc: 0.9501\n",
"Epoch 3/50\n",
"297/297 [==============================] - 43s 145ms/step - loss: 0.4985 - acc: 0.8668 - val_loss: 0.1359 - val_acc: 0.9562\n",
"Epoch 4/50\n",
"297/297 [==============================] - 42s 142ms/step - loss: 0.4170 - acc: 0.8855 - val_loss: 0.0842 - val_acc: 0.9762\n",
"Epoch 5/50\n",
"297/297 [==============================] - 43s 144ms/step - loss: 0.3868 - acc: 0.8931 - val_loss: 0.0797 - val_acc: 0.9753\n",
"Epoch 6/50\n",
"297/297 [==============================] - 42s 142ms/step - loss: 0.3590 - acc: 0.8977 - val_loss: 0.0694 - val_acc: 0.9800\n",
"Epoch 7/50\n",
"297/297 [==============================] - 42s 142ms/step - loss: 0.3032 - acc: 0.9131 - val_loss: 0.0979 - val_acc: 0.9744\n",
"Epoch 8/50\n",
"297/297 [==============================] - 42s 141ms/step - loss: 0.2855 - acc: 0.9215 - val_loss: 0.0934 - val_acc: 0.9776\n",
"Epoch 9/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.2884 - acc: 0.9139 - val_loss: 0.0804 - val_acc: 0.9786\n",
"Epoch 10/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.2766 - acc: 0.9205 - val_loss: 0.0845 - val_acc: 0.9781\n",
"Epoch 11/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.2634 - acc: 0.9223 - val_loss: 0.0678 - val_acc: 0.9832\n",
"Epoch 12/50\n",
"297/297 [==============================] - 42s 141ms/step - loss: 0.2394 - acc: 0.9278 - val_loss: 0.0828 - val_acc: 0.9800\n",
"Epoch 13/50\n",
"297/297 [==============================] - 41s 140ms/step - loss: 0.2330 - acc: 0.9278 - val_loss: 0.0918 - val_acc: 0.9776\n",
"Epoch 14/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.2211 - acc: 0.9329 - val_loss: 0.0638 - val_acc: 0.9837\n",
"Epoch 15/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.2201 - acc: 0.9314 - val_loss: 0.0763 - val_acc: 0.9809\n",
"Epoch 16/50\n",
"297/297 [==============================] - 41s 139ms/step - loss: 0.2236 - acc: 0.9314 - val_loss: 0.0723 - val_acc: 0.9828\n",
"Epoch 17/50\n",
"297/297 [==============================] - 41s 140ms/step - loss: 0.2073 - acc: 0.9354 - val_loss: 0.0781 - val_acc: 0.9823\n",
"Epoch 18/50\n",
"297/297 [==============================] - 41s 140ms/step - loss: 0.2104 - acc: 0.9335 - val_loss: 0.0731 - val_acc: 0.9828\n",
"Epoch 19/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.2066 - acc: 0.9388 - val_loss: 0.0755 - val_acc: 0.9818\n",
"Epoch 20/50\n",
"297/297 [==============================] - 41s 139ms/step - loss: 0.1898 - acc: 0.9442 - val_loss: 0.0707 - val_acc: 0.9823\n",
"Epoch 21/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.1987 - acc: 0.9367 - val_loss: 0.0824 - val_acc: 0.9814\n",
"Epoch 22/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.1885 - acc: 0.9386 - val_loss: 0.0792 - val_acc: 0.9823\n",
"Epoch 23/50\n",
"297/297 [==============================] - 42s 141ms/step - loss: 0.1887 - acc: 0.9398 - val_loss: 0.0688 - val_acc: 0.9837\n",
"Epoch 24/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.1640 - acc: 0.9520 - val_loss: 0.0718 - val_acc: 0.9837\n",
"Epoch 25/50\n",
"297/297 [==============================] - 41s 140ms/step - loss: 0.1703 - acc: 0.9482 - val_loss: 0.0757 - val_acc: 0.9814\n",
"Epoch 26/50\n",
"297/297 [==============================] - 41s 140ms/step - loss: 0.1708 - acc: 0.9476 - val_loss: 0.0801 - val_acc: 0.9823\n",
"Epoch 27/50\n",
"297/297 [==============================] - 42s 143ms/step - loss: 0.1607 - acc: 0.9468 - val_loss: 0.0757 - val_acc: 0.9828\n",
"Epoch 28/50\n",
"297/297 [==============================] - 41s 139ms/step - loss: 0.1672 - acc: 0.9474 - val_loss: 0.1142 - val_acc: 0.9772\n",
"Epoch 29/50\n",
"297/297 [==============================] - 41s 138ms/step - loss: 0.1690 - acc: 0.9470 - val_loss: 0.0686 - val_acc: 0.9841\n",
"Epoch 30/50\n",
"297/297 [==============================] - 41s 140ms/step - loss: 0.1596 - acc: 0.9527 - val_loss: 0.0991 - val_acc: 0.9790\n",
"Epoch 31/50\n",
"297/297 [==============================] - 41s 139ms/step - loss: 0.1565 - acc: 0.9531 - val_loss: 0.0807 - val_acc: 0.9828\n",
"Epoch 32/50\n",
"297/297 [==============================] - 42s 141ms/step - loss: 0.1623 - acc: 0.9466 - val_loss: 0.0642 - val_acc: 0.9874\n",
"Epoch 33/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.1557 - acc: 0.9505 - val_loss: 0.0723 - val_acc: 0.9832\n",
"Epoch 34/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.1431 - acc: 0.9564 - val_loss: 0.0843 - val_acc: 0.9809\n",
"Epoch 35/50\n",
"297/297 [==============================] - 41s 138ms/step - loss: 0.1564 - acc: 0.9497 - val_loss: 0.0707 - val_acc: 0.9837\n",
"Epoch 36/50\n",
"297/297 [==============================] - 42s 142ms/step - loss: 0.1516 - acc: 0.9516 - val_loss: 0.0723 - val_acc: 0.9837\n",
"Epoch 37/50\n",
"297/297 [==============================] - 42s 141ms/step - loss: 0.1477 - acc: 0.9529 - val_loss: 0.0811 - val_acc: 0.9828\n",
"Epoch 38/50\n",
"297/297 [==============================] - 42s 141ms/step - loss: 0.1484 - acc: 0.9529 - val_loss: 0.0754 - val_acc: 0.9832\n",
"Epoch 39/50\n",
"297/297 [==============================] - 41s 139ms/step - loss: 0.1472 - acc: 0.9543 - val_loss: 0.0745 - val_acc: 0.9818\n",
"Epoch 40/50\n",
"297/297 [==============================] - 42s 141ms/step - loss: 0.1353 - acc: 0.9583 - val_loss: 0.0737 - val_acc: 0.9837\n",
"Epoch 41/50\n",
"297/297 [==============================] - 41s 140ms/step - loss: 0.1549 - acc: 0.9489 - val_loss: 0.0793 - val_acc: 0.9823\n",
"Epoch 42/50\n",
"297/297 [==============================] - 41s 140ms/step - loss: 0.1351 - acc: 0.9569 - val_loss: 0.0623 - val_acc: 0.9855\n",
"Epoch 43/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.1247 - acc: 0.9585 - val_loss: 0.0717 - val_acc: 0.9832\n",
"Epoch 44/50\n",
"297/297 [==============================] - 41s 139ms/step - loss: 0.1462 - acc: 0.9529 - val_loss: 0.0753 - val_acc: 0.9814\n",
"Epoch 45/50\n",
"297/297 [==============================] - 41s 138ms/step - loss: 0.1480 - acc: 0.9529 - val_loss: 0.0673 - val_acc: 0.9841\n",
"Epoch 46/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.1277 - acc: 0.9598 - val_loss: 0.0742 - val_acc: 0.9814\n",
"Epoch 47/50\n",
"297/297 [==============================] - 41s 139ms/step - loss: 0.1245 - acc: 0.9596 - val_loss: 0.0657 - val_acc: 0.9846\n",
"Epoch 48/50\n",
"297/297 [==============================] - 42s 140ms/step - loss: 0.1437 - acc: 0.9554 - val_loss: 0.0728 - val_acc: 0.9832\n",
"Epoch 49/50\n",
"297/297 [==============================] - 41s 138ms/step - loss: 0.1436 - acc: 0.9562 - val_loss: 0.0696 - val_acc: 0.9823\n",
"Epoch 50/50\n",
"297/297 [==============================] - 41s 139ms/step - loss: 0.1231 - acc: 0.9611 - val_loss: 0.0627 - val_acc: 0.9832\n"
]
},
{
"data": {
"text/plain": [
"<keras.callbacks.History at 0x7fc78bcf6128>"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from keras import applications\n",
"from keras.preprocessing.image import ImageDataGenerator\n",
"from keras import optimizers\n",
"from keras.models import Sequential\n",
"from keras.layers import Dropout, Flatten, Dense\n",
"\n",
    "# dimensions of our images.\n",
"img_width, img_height = 139, 139\n",
"\n",
"train_data_dir = './natural_images'\n",
"validation_data_dir = './natural_images_validation'\n",
"nb_train_samples = 4754\n",
"nb_validation_samples = 2145\n",
"epochs = 50\n",
"batch_size = 16\n",
"\n",
    "# load the pretrained InceptionV3 base and freeze its layers\n",
"model = Sequential()\n",
"gmodel = applications.InceptionV3(weights='imagenet', include_top=False, input_shape = (139,139,3))\n",
"for layer in gmodel.layers:\n",
" layer.trainable = False\n",
"print('Model loaded.')\n",
"print(\"InceptionV3 Model Summary:\")\n",
"print(gmodel.summary())\n",
"# build a classifier model to put on top of the convolutional model\n",
"top_model = Sequential()\n",
"top_model.add(Flatten(input_shape=(3,3,2048)))\n",
"top_model.add(Dense(256, activation='relu'))\n",
"top_model.add(Dropout(0.5))\n",
"top_model.add(Dense(8, activation='softmax'))\n",
"\n",
    "# the top classifier is trained from scratch; since the base is frozen,\n",
    "# only these dense layers are updated during training\n",
"\n",
"# add the model on top of the convolutional base\n",
"model.add(gmodel)\n",
"model.add(top_model)\n",
"\n",
"print(\"Final Model Summary:\")\n",
"print(model.summary())\n",
    "# compile the model with an SGD/momentum optimizer\n",
    "# and a very low learning rate\n",
"model.compile(loss='categorical_crossentropy',\n",
" optimizer=optimizers.SGD(lr=1e-4, momentum=0.9),\n",
" metrics=['accuracy'])\n",
"\n",
"# prepare data augmentation configuration\n",
"train_datagen = ImageDataGenerator(\n",
" rescale=1. / 255,\n",
" shear_range=0.2,\n",
" zoom_range=0.2,\n",
" horizontal_flip=True)\n",
"\n",
"test_datagen = ImageDataGenerator(rescale=1. / 255)\n",
"\n",
"train_generator = train_datagen.flow_from_directory(\n",
" train_data_dir,\n",
" target_size=(img_height, img_width),\n",
" batch_size=batch_size,\n",
" class_mode='categorical')\n",
"\n",
"validation_generator = test_datagen.flow_from_directory(\n",
" validation_data_dir,\n",
" target_size=(img_height, img_width),\n",
" batch_size=batch_size,\n",
" class_mode='categorical')\n",
"\n",
    "# train the top classifier (the frozen base acts as a feature extractor)\n",
    "model.fit_generator(\n",
    "    train_generator,\n",
    "    steps_per_epoch=nb_train_samples // batch_size,\n",
    "    epochs=epochs,\n",
    "    validation_data=validation_generator,\n",
    "    validation_steps=nb_validation_samples // batch_size,\n",
    "    shuffle=True)"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Saved model to disk\n"
]
}
],
"source": [
"model_json = model.to_json()\n",
"with open(\"model_pretrained.json\", \"w\") as json_file:\n",
" json_file.write(model_json)\n",
"model.save_weights(\"model_pretrained.h5\")\n",
"print(\"Saved model to disk\")"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Loaded model from disk\n",
"2145/2145 [==============================] - 127s 59ms/step\n"
]
}
],
"source": [
"from keras.models import model_from_json\n",
    "from keras import optimizers\n",
"json_file = open('model_pretrained.json', 'r')\n",
"loaded_model_json = json_file.read()\n",
"json_file.close()\n",
"model = model_from_json(loaded_model_json)\n",
"# load weights into new model\n",
"model.load_weights(\"model_pretrained.h5\")\n",
"print(\"Loaded model from disk\")\n",
    "# evaluate the loaded model on the validation data\n",
"model.compile(loss='categorical_crossentropy',\n",
" optimizer=optimizers.SGD(lr=1e-4, momentum=0.9),\n",
" metrics=['accuracy'])\n",
    "# one pass over the validation set: steps = number of samples / batch size\n",
    "score = model.evaluate_generator(validation_generator, steps=nb_validation_samples // batch_size, verbose=1)"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"acc: 98.32%\n"
]
}
],
"source": [
"print(\"%s: %.2f%%\" % (model.metrics_names[1], score[1]*100))"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.2"
}
},
"nbformat": 4,
"nbformat_minor": 2
}