@yshean
Created December 17, 2015 12:49
Keras with Theano backend: a working example of training an autoencoder on the MNIST dataset.
# Hyperparameters
n_in = 784           # 28x28 MNIST images, flattened
n_hid = 392          # hidden layer: half the input dimensionality
n_out = n_in         # reconstruction has the same size as the input
batch_size = 64
nb_epoch = 10
learning_rate = 1e-3
# Load MNIST, flatten each 28x28 image to a 784-vector,
# and scale pixel values from [0, 255] to [0, 1]
from keras.datasets import mnist
import numpy as np

(X_train, y_train), (X_test, y_test) = mnist.load_data()
X_train = X_train.reshape(X_train.shape[0], -1)
X_test = X_test.reshape(X_test.shape[0], -1)
X_train = X_train.astype('float32')
X_test = X_test.astype('float32')
X_train /= 255
X_test /= 255
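The reshape-and-scale step above can be sketched in plain NumPy, using random data in place of the real MNIST arrays (a minimal sketch; the array `X` here is a stand-in, not the actual dataset):

```python
import numpy as np

# Stand-in for the MNIST arrays: 100 fake 28x28 grayscale images.
X = np.random.randint(0, 256, size=(100, 28, 28)).astype('float32')

# Flatten each image to a 784-dimensional row vector.
X = X.reshape(X.shape[0], -1)
assert X.shape == (100, 784)

# Scale pixel intensities from [0, 255] to [0, 1] so the sigmoid
# output layer can reproduce them.
X /= 255
assert X.min() >= 0.0 and X.max() <= 1.0
```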
# Build a single-hidden-layer autoencoder using the AutoEncoder
# container from Keras 0.x (this layer was removed in later Keras versions)
from keras.layers.core import Dense, AutoEncoder
from keras.models import Sequential
from keras.optimizers import RMSprop

encoder = Dense(input_dim=n_in, output_dim=n_hid, activation='sigmoid')
decoder = Dense(input_dim=n_hid, output_dim=n_out, activation='sigmoid')

model = Sequential()
model.add(AutoEncoder(encoder=encoder, decoder=decoder, output_reconstruction=True))

rmsprop = RMSprop(lr=learning_rate)
model.compile(loss='mean_squared_error', optimizer=rmsprop)

# Train to reconstruct the inputs: the targets are the inputs themselves
model.fit(X_train, X_train, batch_size=batch_size, nb_epoch=nb_epoch, verbose=1)
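For readers on Keras versions where the `AutoEncoder` container no longer exists: the computation the model performs is just two dense sigmoid layers followed by a mean-squared-error loss. A minimal NumPy sketch of that forward pass (random weights standing in for trained parameters, random data standing in for a batch of images):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_hid = 784, 392
rng = np.random.default_rng(0)

# Randomly initialised weights, standing in for the trained parameters.
W_enc = rng.normal(0, 0.05, size=(n_in, n_hid))
b_enc = np.zeros(n_hid)
W_dec = rng.normal(0, 0.05, size=(n_hid, n_in))
b_dec = np.zeros(n_in)

X = rng.random((64, n_in))               # one batch of flattened images
H = sigmoid(X @ W_enc + b_enc)           # encoder: 784 -> 392
X_rec = sigmoid(H @ W_dec + b_dec)       # decoder: 392 -> 784

# Mean squared reconstruction error: the quantity RMSprop minimises above.
mse = np.mean((X - X_rec) ** 2)
assert X_rec.shape == X.shape
```

With `output_reconstruction=True`, the model's output is `X_rec`; training drives `mse` down by adjusting the four weight arrays.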
yshean commented Dec 17, 2015

Example output:

Using Theano backend.
Using gpu device 0: GeForce GTX TITAN X (CNMeM is disabled)
Epoch 1/10
60000/60000 [==============================] - 2s - loss: 0.0755      
Epoch 2/10
60000/60000 [==============================] - 2s - loss: 0.0651     
Epoch 3/10
60000/60000 [==============================] - 2s - loss: 0.0616     
Epoch 4/10
60000/60000 [==============================] - 2s - loss: 0.0574     
Epoch 5/10
60000/60000 [==============================] - 2s - loss: 0.0532     
Epoch 6/10
60000/60000 [==============================] - 2s - loss: 0.0496     
Epoch 7/10
60000/60000 [==============================] - 2s - loss: 0.0465      
Epoch 8/10
60000/60000 [==============================] - 2s - loss: 0.0440     
Epoch 9/10
60000/60000 [==============================] - 2s - loss: 0.0419     
Epoch 10/10
60000/60000 [==============================] - 2s - loss: 0.0401
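Since the loss is mean squared error on inputs scaled to [0, 1], the final value of 0.0401 corresponds to an average per-pixel reconstruction error (RMSE) of roughly 0.2:

```python
import math

# Final training loss from the log above: per-pixel MSE on [0, 1] inputs.
final_mse = 0.0401
rmse = math.sqrt(final_mse)
print(round(rmse, 3))  # prints 0.2: average per-pixel reconstruction error
```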
