CIFAR-10 image classification with Keras ConvNet
'''
CIFAR-10 classification with a Keras ConvNet.
Original dataset and description: https://www.cs.toronto.edu/~kriz/cifar.html
Accompanying post: https://www.bonaccorso.eu/2016/08/06/cifar-10-image-classification-with-keras-convnet/
'''
from __future__ import print_function
import numpy as np
from keras.callbacks import EarlyStopping
from keras.datasets import cifar10
from keras.models import Sequential
from keras.layers import Conv2D, Dense, Dropout, Flatten, MaxPooling2D
from keras.optimizers import Adam
from keras.utils import to_categorical
# For reproducibility
np.random.seed(1000)
if __name__ == '__main__':
    # Load the dataset
    (X_train, Y_train), (X_test, Y_test) = cifar10.load_data()

    # Create the model
    model = Sequential()

    model.add(Conv2D(32, kernel_size=(3, 3), activation='relu', input_shape=(32, 32, 3)))
    model.add(Conv2D(64, kernel_size=(3, 3), activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Dropout(0.25))

    model.add(Conv2D(128, kernel_size=(3, 3), activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Conv2D(128, kernel_size=(3, 3), activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Dropout(0.25))

    model.add(Flatten())
    model.add(Dense(1024, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(10, activation='softmax'))

    # Compile the model
    model.compile(loss='categorical_crossentropy',
                  optimizer=Adam(lr=0.0001, decay=1e-6),
                  metrics=['accuracy'])

    # Train the model
    model.fit(X_train / 255.0, to_categorical(Y_train),
              batch_size=128,
              shuffle=True,
              epochs=250,
              validation_data=(X_test / 255.0, to_categorical(Y_test)),
              callbacks=[EarlyStopping(min_delta=0.001, patience=3)])

    # Evaluate the model
    scores = model.evaluate(X_test / 255.0, to_categorical(Y_test))
    print('Loss: %.3f' % scores[0])
    print('Accuracy: %.3f' % scores[1])
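With these settings, EarlyStopping monitors the validation loss (its default) and halts training once val_loss fails to improve by at least min_delta for patience consecutive epochs, which is why the run logged below ends well before epoch 250. A possible extension, not part of the original script, is to also keep the best weights seen during training with a ModelCheckpoint callback; a minimal sketch with an arbitrary output path would replace the fit() call above with:

from keras.callbacks import EarlyStopping, ModelCheckpoint

# Hypothetical output file; keeps only the weights of the epoch with the lowest val_loss (requires h5py)
checkpoint = ModelCheckpoint('cifar10_best.h5', monitor='val_loss', save_best_only=True)

model.fit(X_train / 255.0, to_categorical(Y_train),
          batch_size=128,
          shuffle=True,
          epochs=250,
          validation_data=(X_test / 255.0, to_categorical(Y_test)),
          callbacks=[EarlyStopping(min_delta=0.001, patience=3), checkpoint])

Training output of the original script: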
Epoch 1/250
50000/50000 [==============================] - 8s - loss: 2.0241 - acc: 0.2407 - val_loss: 1.7554 - val_acc: 0.3679
Epoch 2/250
50000/50000 [==============================] - 7s - loss: 1.6900 - acc: 0.3752 - val_loss: 1.5634 - val_acc: 0.4311
Epoch 3/250
50000/50000 [==============================] - 7s - loss: 1.5477 - acc: 0.4333 - val_loss: 1.4581 - val_acc: 0.4715
Epoch 4/250
50000/50000 [==============================] - 7s - loss: 1.4600 - acc: 0.4669 - val_loss: 1.3661 - val_acc: 0.5080
Epoch 5/250
50000/50000 [==============================] - 7s - loss: 1.3886 - acc: 0.4982 - val_loss: 1.2941 - val_acc: 0.5327
Epoch 6/250
50000/50000 [==============================] - 7s - loss: 1.3239 - acc: 0.5236 - val_loss: 1.2299 - val_acc: 0.5627
Epoch 7/250
50000/50000 [==============================] - 7s - loss: 1.2725 - acc: 0.5465 - val_loss: 1.1857 - val_acc: 0.5819
Epoch 8/250
50000/50000 [==============================] - 7s - loss: 1.2253 - acc: 0.5637 - val_loss: 1.1405 - val_acc: 0.5905
Epoch 9/250
50000/50000 [==============================] - 7s - loss: 1.1831 - acc: 0.5786 - val_loss: 1.1158 - val_acc: 0.6049
Epoch 10/250
50000/50000 [==============================] - 7s - loss: 1.1481 - acc: 0.5952 - val_loss: 1.0691 - val_acc: 0.6213
Epoch 11/250
50000/50000 [==============================] - 7s - loss: 1.1144 - acc: 0.6087 - val_loss: 1.0336 - val_acc: 0.6366
Epoch 12/250
50000/50000 [==============================] - 7s - loss: 1.0888 - acc: 0.6159 - val_loss: 1.0219 - val_acc: 0.6435
Epoch 13/250
50000/50000 [==============================] - 7s - loss: 1.0599 - acc: 0.6283 - val_loss: 1.0057 - val_acc: 0.6466
Epoch 14/250
50000/50000 [==============================] - 7s - loss: 1.0313 - acc: 0.6366 - val_loss: 0.9747 - val_acc: 0.6621
Epoch 15/250
50000/50000 [==============================] - 7s - loss: 1.0126 - acc: 0.6446 - val_loss: 0.9838 - val_acc: 0.6588
Epoch 16/250
50000/50000 [==============================] - 7s - loss: 0.9848 - acc: 0.6547 - val_loss: 0.9212 - val_acc: 0.6774
Epoch 17/250
50000/50000 [==============================] - 7s - loss: 0.9672 - acc: 0.6603 - val_loss: 0.9165 - val_acc: 0.6821
Epoch 18/250
50000/50000 [==============================] - 7s - loss: 0.9410 - acc: 0.6705 - val_loss: 0.8926 - val_acc: 0.6912
Epoch 19/250
50000/50000 [==============================] - 7s - loss: 0.9233 - acc: 0.6781 - val_loss: 0.8909 - val_acc: 0.6908
Epoch 20/250
50000/50000 [==============================] - 7s - loss: 0.9028 - acc: 0.6842 - val_loss: 0.8558 - val_acc: 0.7051
Epoch 21/250
50000/50000 [==============================] - 7s - loss: 0.8851 - acc: 0.6901 - val_loss: 0.8618 - val_acc: 0.7018
Epoch 22/250
50000/50000 [==============================] - 7s - loss: 0.8701 - acc: 0.6956 - val_loss: 0.8321 - val_acc: 0.7131
Epoch 23/250
50000/50000 [==============================] - 7s - loss: 0.8508 - acc: 0.7048 - val_loss: 0.8248 - val_acc: 0.7145
Epoch 24/250
50000/50000 [==============================] - 7s - loss: 0.8369 - acc: 0.7087 - val_loss: 0.8083 - val_acc: 0.7220
Epoch 25/250
50000/50000 [==============================] - 7s - loss: 0.8258 - acc: 0.7114 - val_loss: 0.8023 - val_acc: 0.7191
Epoch 26/250
50000/50000 [==============================] - 7s - loss: 0.8102 - acc: 0.7176 - val_loss: 0.7914 - val_acc: 0.7256
Epoch 27/250
50000/50000 [==============================] - 7s - loss: 0.7910 - acc: 0.7230 - val_loss: 0.7717 - val_acc: 0.7354
Epoch 28/250
50000/50000 [==============================] - 7s - loss: 0.7835 - acc: 0.7260 - val_loss: 0.7682 - val_acc: 0.7349
Epoch 29/250
50000/50000 [==============================] - 7s - loss: 0.7716 - acc: 0.7311 - val_loss: 0.7557 - val_acc: 0.7371
Epoch 30/250
50000/50000 [==============================] - 7s - loss: 0.7568 - acc: 0.7364 - val_loss: 0.7483 - val_acc: 0.7409
Epoch 31/250
50000/50000 [==============================] - 7s - loss: 0.7458 - acc: 0.7390 - val_loss: 0.7527 - val_acc: 0.7382
Epoch 32/250
50000/50000 [==============================] - 7s - loss: 0.7334 - acc: 0.7444 - val_loss: 0.7391 - val_acc: 0.7439
Epoch 33/250
50000/50000 [==============================] - 7s - loss: 0.7293 - acc: 0.7463 - val_loss: 0.7523 - val_acc: 0.7387
Epoch 34/250
50000/50000 [==============================] - 7s - loss: 0.7122 - acc: 0.7509 - val_loss: 0.7234 - val_acc: 0.7494
Epoch 35/250
50000/50000 [==============================] - 7s - loss: 0.7039 - acc: 0.7525 - val_loss: 0.7079 - val_acc: 0.7533
Epoch 36/250
50000/50000 [==============================] - 7s - loss: 0.6925 - acc: 0.7588 - val_loss: 0.7177 - val_acc: 0.7535
Epoch 37/250
50000/50000 [==============================] - 7s - loss: 0.6829 - acc: 0.7606 - val_loss: 0.6987 - val_acc: 0.7598
Epoch 38/250
50000/50000 [==============================] - 7s - loss: 0.6721 - acc: 0.7659 - val_loss: 0.6984 - val_acc: 0.7597
Epoch 39/250
50000/50000 [==============================] - 7s - loss: 0.6618 - acc: 0.7682 - val_loss: 0.6875 - val_acc: 0.7604
Epoch 40/250
50000/50000 [==============================] - 7s - loss: 0.6526 - acc: 0.7718 - val_loss: 0.6852 - val_acc: 0.7627
Epoch 41/250
50000/50000 [==============================] - 7s - loss: 0.6416 - acc: 0.7767 - val_loss: 0.6901 - val_acc: 0.7634
Epoch 42/250
50000/50000 [==============================] - 7s - loss: 0.6389 - acc: 0.7774 - val_loss: 0.6787 - val_acc: 0.7647
Epoch 43/250
50000/50000 [==============================] - 7s - loss: 0.6283 - acc: 0.7785 - val_loss: 0.6790 - val_acc: 0.7649
Epoch 44/250
50000/50000 [==============================] - 7s - loss: 0.6124 - acc: 0.7856 - val_loss: 0.6798 - val_acc: 0.7646
Epoch 45/250
50000/50000 [==============================] - 7s - loss: 0.6108 - acc: 0.7858 - val_loss: 0.6736 - val_acc: 0.7693
Epoch 46/250
50000/50000 [==============================] - 7s - loss: 0.5985 - acc: 0.7896 - val_loss: 0.6578 - val_acc: 0.7730
Epoch 47/250
50000/50000 [==============================] - 7s - loss: 0.5935 - acc: 0.7923 - val_loss: 0.6538 - val_acc: 0.7755
Epoch 48/250
50000/50000 [==============================] - 7s - loss: 0.5862 - acc: 0.7945 - val_loss: 0.6751 - val_acc: 0.7702
Epoch 49/250
50000/50000 [==============================] - 7s - loss: 0.5746 - acc: 0.7991 - val_loss: 0.6498 - val_acc: 0.7738
Epoch 50/250
50000/50000 [==============================] - 7s - loss: 0.5692 - acc: 0.8009 - val_loss: 0.6494 - val_acc: 0.7759
Epoch 51/250
50000/50000 [==============================] - 7s - loss: 0.5613 - acc: 0.8027 - val_loss: 0.6485 - val_acc: 0.7788
Epoch 52/250
50000/50000 [==============================] - 7s - loss: 0.5557 - acc: 0.8043 - val_loss: 0.6430 - val_acc: 0.7765
Epoch 53/250
50000/50000 [==============================] - 7s - loss: 0.5470 - acc: 0.8084 - val_loss: 0.6363 - val_acc: 0.7801
Epoch 54/250
50000/50000 [==============================] - 7s - loss: 0.5387 - acc: 0.8140 - val_loss: 0.6695 - val_acc: 0.7729
Epoch 55/250
50000/50000 [==============================] - 7s - loss: 0.5333 - acc: 0.8135 - val_loss: 0.6472 - val_acc: 0.7789
Epoch 56/250
50000/50000 [==============================] - 7s - loss: 0.5274 - acc: 0.8138 - val_loss: 0.6358 - val_acc: 0.7832
Epoch 57/250
50000/50000 [==============================] - 7s - loss: 0.5209 - acc: 0.8178 - val_loss: 0.6276 - val_acc: 0.7859
Epoch 58/250
50000/50000 [==============================] - 7s - loss: 0.5113 - acc: 0.8218 - val_loss: 0.6442 - val_acc: 0.7811
Epoch 59/250
50000/50000 [==============================] - 7s - loss: 0.5046 - acc: 0.8241 - val_loss: 0.6353 - val_acc: 0.7851
Epoch 60/250
50000/50000 [==============================] - 7s - loss: 0.5040 - acc: 0.8238 - val_loss: 0.6239 - val_acc: 0.7900
Epoch 61/250
50000/50000 [==============================] - 7s - loss: 0.4921 - acc: 0.8261 - val_loss: 0.6271 - val_acc: 0.7886
Epoch 62/250
50000/50000 [==============================] - 7s - loss: 0.4921 - acc: 0.8258 - val_loss: 0.6158 - val_acc: 0.7905
Epoch 63/250
50000/50000 [==============================] - 7s - loss: 0.4781 - acc: 0.8308 - val_loss: 0.6162 - val_acc: 0.7900
Epoch 64/250
50000/50000 [==============================] - 7s - loss: 0.4794 - acc: 0.8316 - val_loss: 0.6333 - val_acc: 0.7847
Epoch 65/250
50000/50000 [==============================] - 7s - loss: 0.4691 - acc: 0.8357 - val_loss: 0.6230 - val_acc: 0.7883
Epoch 66/250
50000/50000 [==============================] - 7s - loss: 0.4597 - acc: 0.8383 - val_loss: 0.6217 - val_acc: 0.7900
9408/10000 [===========================>..] - ETA: 0s
Loss: 0.622
Accuracy: 0.790
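
As a follow-up not shown in the original gist, the trained model can be saved and then used for single-image predictions. A minimal sketch, assuming it runs in the same session right after the script above (the file name is arbitrary):

# Persist the trained architecture + weights (requires h5py)
model.save('cifar10_convnet.h5')

# Predict the class of one test image, using the same [0, 1] scaling as during training
class_names = ['airplane', 'automobile', 'bird', 'cat', 'deer',
               'dog', 'frog', 'horse', 'ship', 'truck']  # CIFAR-10 label order
probabilities = model.predict(X_test[0:1] / 255.0)
print('Predicted class: %s' % class_names[np.argmax(probabilities[0])])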
@mrr00b00t
Which GPU did you use?
