@ZER-0-NE, created July 3, 2019 16:32
Image dimensions: 200x300
*********************************************************
from keras import backend as K
from keras import optimizers
from keras.models import Sequential
from keras.layers import Conv2D, Activation, MaxPooling2D, Dropout, Flatten, Dense

img_width, img_height = 200, 300

if K.image_data_format() == 'channels_first':
    input_shape = (3, img_width, img_height)
else:
    input_shape = (img_width, img_height, 3)
model = Sequential()

# Conv block 1: 128 filters, 7x7 kernel, 2x2 max pooling
model.add(Conv2D(128, (7, 7), padding='same', input_shape=input_shape))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.5))

# Conv blocks 2-4: 128 filters each
model.add(Conv2D(128, (7, 7), padding='same'))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.6))

model.add(Conv2D(128, (7, 7), padding='same'))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.6))

model.add(Conv2D(128, (7, 7), padding='same'))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.6))

# Conv blocks 5-6: 64 filters each
model.add(Conv2D(64, (7, 7), padding='same'))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.6))

model.add(Conv2D(64, (7, 7), padding='same'))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.6))

# Conv block 7: 32 filters
model.add(Conv2D(32, (7, 7), padding='same'))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.6))

# Classifier head: single sigmoid unit for binary classification
model.add(Flatten())
model.add(Dense(32))
model.add(Activation('relu'))
model.add(Dense(1))
model.add(Activation('sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer=optimizers.Adam(lr=3e-5),
              metrics=['accuracy'])
model.summary()
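Note on the shapes model.summary() reports: each of the seven 2x2 max-pooling layers uses the default 'valid' padding, so the 200x300 input shrinks as 200 -> 100 -> 50 -> 25 -> 12 -> 6 -> 3 -> 1 and 300 -> 150 -> 75 -> 37 -> 18 -> 9 -> 4 -> 2, leaving a 1x2x32 feature map. Flatten therefore feeds 1 * 2 * 32 = 64 features into the Dense(32) layer.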
*************************************************************************************
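The gist does not show the data-loading and training calls, but the log below (the two "Found ... images" lines, 910 steps per epoch, 80 epochs) is consistent with Keras ImageDataGenerator generators built via flow_from_directory and training via fit_generator. A minimal sketch of that setup follows; the directory paths, rescaling, and batch size of 32 (910 * 32 = 29,120, roughly the 29,124 training images) are assumptions, not taken from the gist.

from keras.preprocessing.image import ImageDataGenerator

batch_size = 32  # assumed: 910 steps/epoch * 32 ~ 29,124 training images

train_datagen = ImageDataGenerator(rescale=1. / 255)
val_datagen = ImageDataGenerator(rescale=1. / 255)

# 'data/train' and 'data/validation' are placeholder paths, not from the gist.
train_generator = train_datagen.flow_from_directory(
    'data/train',
    target_size=(img_width, img_height),  # (200, 300), matching input_shape
    batch_size=batch_size,
    class_mode='binary')  # prints "Found 29124 images belonging to 2 classes."

validation_generator = val_datagen.flow_from_directory(
    'data/validation',
    target_size=(img_width, img_height),
    batch_size=batch_size,
    class_mode='binary')  # prints "Found 10401 images belonging to 2 classes."

model.fit_generator(
    train_generator,
    steps_per_epoch=910,
    epochs=80,
    validation_data=validation_generator,
    validation_steps=validation_generator.samples // batch_size)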
Found 29124 images belonging to 2 classes.
Found 10401 images belonging to 2 classes.
Epoch 1/80
910/910 [==============================] - 1097s 1s/step - loss: 0.6984 - acc: 0.4969 - val_loss: 0.6932 - val_acc: 0.4079
Epoch 2/80
910/910 [==============================] - 1100s 1s/step - loss: 0.6729 - acc: 0.5485 - val_loss: 0.5663 - val_acc: 0.8185
Epoch 3/80
910/910 [==============================] - 1100s 1s/step - loss: 0.4114 - acc: 0.8307 - val_loss: 0.3781 - val_acc: 0.8797
Epoch 4/80
910/910 [==============================] - 1090s 1s/step - loss: 0.3269 - acc: 0.8748 - val_loss: 0.3166 - val_acc: 0.9001
Epoch 5/80
910/910 [==============================] - 1087s 1s/step - loss: 0.2686 - acc: 0.8986 - val_loss: 0.3041 - val_acc: 0.9242
Epoch 6/80
910/910 [==============================] - 1091s 1s/step - loss: 0.2255 - acc: 0.9170 - val_loss: 0.2521 - val_acc: 0.9178
Epoch 7/80
910/910 [==============================] - 1095s 1s/step - loss: 0.1977 - acc: 0.9269 - val_loss: 0.2162 - val_acc: 0.9240
Epoch 8/80
910/910 [==============================] - 1069s 1s/step - loss: 0.1819 - acc: 0.9340 - val_loss: 0.1801 - val_acc: 0.9437
Epoch 9/80
910/910 [==============================] - 1169s 1s/step - loss: 0.1614 - acc: 0.9411 - val_loss: 0.1908 - val_acc: 0.9247
Epoch 10/80
910/910 [==============================] - 1093s 1s/step - loss: 0.1627 - acc: 0.9409 - val_loss: 0.1616 - val_acc: 0.9450
Epoch 11/80
910/910 [==============================] - 1060s 1s/step - loss: 0.1539 - acc: 0.9448 - val_loss: 0.2095 - val_acc: 0.9243
Epoch 12/80
910/910 [==============================] - 1061s 1s/step - loss: 0.1451 - acc: 0.9459 - val_loss: 0.1874 - val_acc: 0.9326
Epoch 13/80
910/910 [==============================] - 1060s 1s/step - loss: 0.1435 - acc: 0.9469 - val_loss: 0.2133 - val_acc: 0.9078
Epoch 14/80
910/910 [==============================] - 1059s 1s/step - loss: 0.1394 - acc: 0.9487 - val_loss: 0.1902 - val_acc: 0.9246
Epoch 15/80
910/910 [==============================] - 1055s 1s/step - loss: 0.1366 - acc: 0.9495 - val_loss: 0.1446 - val_acc: 0.9514
Epoch 16/80
910/910 [==============================] - 1055s 1s/step - loss: 0.1347 - acc: 0.9504 - val_loss: 0.1568 - val_acc: 0.9500
Epoch 17/80
910/910 [==============================] - 1057s 1s/step - loss: 0.1312 - acc: 0.9498 - val_loss: 0.1629 - val_acc: 0.9384
Epoch 18/80
910/910 [==============================] - 1055s 1s/step - loss: 0.1260 - acc: 0.9525 - val_loss: 0.1408 - val_acc: 0.9488
Epoch 19/80
910/910 [==============================] - 1056s 1s/step - loss: 0.1285 - acc: 0.9523 - val_loss: 0.1470 - val_acc: 0.9514
Epoch 20/80
910/910 [==============================] - 1054s 1s/step - loss: 0.1288 - acc: 0.9519 - val_loss: 0.1350 - val_acc: 0.9543
Epoch 21/80
910/910 [==============================] - 1055s 1s/step - loss: 0.1244 - acc: 0.9534 - val_loss: 0.1558 - val_acc: 0.9554
Epoch 22/80
910/910 [==============================] - 1050s 1s/step - loss: 0.1244 - acc: 0.9534 - val_loss: 0.1395 - val_acc: 0.9556
Epoch 23/80
910/910 [==============================] - 1050s 1s/step - loss: 0.1247 - acc: 0.9531 - val_loss: 0.1473 - val_acc: 0.9570
Epoch 24/80
910/910 [==============================] - 1052s 1s/step - loss: 0.1210 - acc: 0.9533 - val_loss: 0.1368 - val_acc: 0.9571
Epoch 25/80
910/910 [==============================] - 1051s 1s/step - loss: 0.1170 - acc: 0.9567 - val_loss: 0.1323 - val_acc: 0.9579
Epoch 26/80
910/910 [==============================] - 1052s 1s/step - loss: 0.1186 - acc: 0.9558 - val_loss: 0.1269 - val_acc: 0.9552
Epoch 27/80
910/910 [==============================] - 1056s 1s/step - loss: 0.1159 - acc: 0.9567 - val_loss: 0.1297 - val_acc: 0.9580
Epoch 28/80
910/910 [==============================] - 1058s 1s/step - loss: 0.1164 - acc: 0.9553 - val_loss: 0.1429 - val_acc: 0.9582
Epoch 29/80
910/910 [==============================] - 1051s 1s/step - loss: 0.1136 - acc: 0.9572 - val_loss: 0.1315 - val_acc: 0.9596
Epoch 30/80
910/910 [==============================] - 1060s 1s/step - loss: 0.1120 - acc: 0.9575 - val_loss: 0.1315 - val_acc: 0.9592
Epoch 31/80
910/910 [==============================] - 1050s 1s/step - loss: 0.1117 - acc: 0.9581 - val_loss: 0.1240 - val_acc: 0.9586
Epoch 32/80
910/910 [==============================] - 1055s 1s/step - loss: 0.1105 - acc: 0.9582 - val_loss: 0.1276 - val_acc: 0.9594
Epoch 33/80
910/910 [==============================] - 1054s 1s/step - loss: 0.1081 - acc: 0.9594 - val_loss: 0.1181 - val_acc: 0.9584
Epoch 34/80
910/910 [==============================] - 1051s 1s/step - loss: 0.1098 - acc: 0.9590 - val_loss: 0.1225 - val_acc: 0.9591
Epoch 35/80
910/910 [==============================] - 1054s 1s/step - loss: 0.1084 - acc: 0.9589 - val_loss: 0.1420 - val_acc: 0.9553
Epoch 36/80
910/910 [==============================] - 1053s 1s/step - loss: 0.1087 - acc: 0.9596 - val_loss: 0.1208 - val_acc: 0.9587
Epoch 37/80
910/910 [==============================] - 1054s 1s/step - loss: 0.1057 - acc: 0.9592 - val_loss: 0.1160 - val_acc: 0.9579
Epoch 38/80
910/910 [==============================] - 1052s 1s/step - loss: 0.1065 - acc: 0.9600 - val_loss: 0.1169 - val_acc: 0.9564
Epoch 39/80
910/910 [==============================] - 1049s 1s/step - loss: 0.1037 - acc: 0.9605 - val_loss: 0.1232 - val_acc: 0.9603
Epoch 40/80
910/910 [==============================] - 1052s 1s/step - loss: 0.1051 - acc: 0.9595 - val_loss: 0.1268 - val_acc: 0.9590