keras-notes.md

Table of Contents

  • Input Shape (2D)
  • Weights Shape (FC layer)
  • Linear Regression
  • Set Weights Manually
  • Perform Prediction
  • Save and Load Model
  • Use Two Different Files
  • Use One File

Input Shape (2D)

  • num_samples x feature_dim
  • there are num_samples rows and feature_dim columns
import numpy as np

num_samples = 100
feature_dim = 8
data = np.random.random((num_samples, feature_dim))  # shape: (100, 8)

Weights Shape (FC layer)

  • number of rows: equal to the input dimension
  • number of columns: equal to the number of output units (see the sketch below)
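
A quick way to check this is to build a one-layer model and inspect its weights. The sizes below (input_dim=8, units=4) are chosen only for illustration:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(4, input_dim=8))

kernel, bias = model.get_weights()
print(kernel.shape)  # (8, 4): input_dim rows x units columns
print(bias.shape)    # (4,): one bias per output unit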

Linear Regression

from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

import numpy as np

w = 10   # ground-truth slope
b = -2   # ground-truth intercept

x_train = np.random.random((1000, 1))
y_train = w * x_train + b

x_test = np.random.random((20, 1))
y_test = w * x_test + b


# a single Dense unit with a linear activation implements y = w*x + b
model = Sequential()
model.add(Dense(1, input_dim=1))

sgd = SGD(lr=0.0001, momentum=0.9, nesterov=True)
model.compile(loss='mse', optimizer=sgd, metrics=['mse'])

model.fit(x_train, y_train, epochs=200, batch_size=10)
score = model.evaluate(x_test, y_test, batch_size=10)
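
A quick sanity check: after enough epochs the learned kernel and bias should approach w and b (how close depends on the random data and the run):

learned_w, learned_b = model.get_weights()
print(learned_w)  # should approach [[10.]]
print(learned_b)  # should approach [-2.]
print(score)      # [test loss (mse), test mse]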

Set Weights Manually

sw = np.array([[10]], dtype=np.float32)
sb = np.array([-2], dtype=np.float32)
model.set_weights([sw, sb])
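
With the weights set by hand, the model computes y = 10*x - 2 exactly:

print(model.get_weights())               # [array([[10.]]), array([-2.])]
print(model.predict(np.array([[1.0]])))  # [[8.]] since 10*1 - 2 = 8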

Perform Prediction

model.predict([1]) # single sample
model.predict([1, 2]) # two samples
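
Passing plain Python lists relies on Keras converting them to arrays internally; the equivalent calls with explicitly shaped arrays (num_samples x feature_dim) are:

model.predict(np.array([[1]]))       # shape (1, 1): one sample
model.predict(np.array([[1], [2]]))  # shape (2, 1): two samples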

Save and Load Model

There are two ways: (1) save the model definition and the trained weights as separate files; (2) save both in a single file.

Use Two Different Files

Save the model definition to a JSON file:

with open('model_definition.json', 'w') as f:
    f.write(model.to_json())

The saved file looks like this:

cat model_definition.json
{"class_name": "Sequential", "config": {"name": "sequential_6", "layers": [{"class_name": "Dense", "config": {"name": "dense_6", "trainable": true, "batch_input_shape": [null, 1], "dtype": "float32", "units": 1, "activation": "linear", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": null, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}}]}, "keras_version": "2.2.4", "backend": "tensorflow"}

Next, save the trained weights

model.save_weights('model_weights.h5')

View the saved weights

$ ls -lh model_weights.h5
-rw-r--r--  1 xxx  xxx    10K Jan  8 16:57 model_weights.h5
$ strings model_weights.h5
TREE
HEAP
dense_6
layer_names
dense_6
backend
keras_version
TREE
HEAP
dense_6
weight_names
dense_6/kernel:0dense_6/bias:0
GCOL
tensorflow
2.2.4
SNOD
TREE
HEAP
kernel:0
bias:0
SNOD
SNOD
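
strings only shows the text fragments inside the HDF5 file. To see the stored values themselves, the file can be walked with h5py (a sketch, assuming h5py is installed):

import h5py

with h5py.File('model_weights.h5', 'r') as f:
    def show(name, obj):
        # prints each dataset path (the kernel and bias of dense_6) with its value
        if isinstance(obj, h5py.Dataset):
            print(name, obj[()])
    f.visititems(show)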

Now load the model from the two files:

from keras.models import model_from_json
with open('model_definition.json', 'r') as f:
    model2 = model_from_json(f.read())
model2.load_weights('model_weights.h5')
print(model2.get_weights())
model2.predict([3])

Output:

[array([[10.]], dtype=float32), array([-2.], dtype=float32)]
array([[28.]], dtype=float32)
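
Note that model_from_json only rebuilds the architecture, so model2 is not compiled: predict works as shown, but evaluate and fit require compiling again first, e.g.

model2.compile(loss='mse', optimizer='sgd', metrics=['mse'])
print(model2.evaluate(x_test, y_test, batch_size=10))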

Use One File

Save the model (this single HDF5 file stores the architecture, the weights, the training configuration, and the optimizer state)

model.save('model.h5')

Load the model

from keras.models import load_model
model3 = load_model('model.h5')
print(model3.get_weights())
model3.predict([3])

Output:

[array([[10.]], dtype=float32), array([-2.], dtype=float32)]
array([[28.]], dtype=float32)
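
Unlike the two-file approach, load_model also restores the training configuration and optimizer state (see model_config, training_config and optimizer_weights in the strings output below), so training can resume directly:

model3.fit(x_train, y_train, epochs=10, batch_size=10)  # picks up the saved optimizer state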

View the saved model

$ strings model.h5
TREE
HEAP
model_weights
optimizer_weights
keras_version
backend
model_config
TREE
HEAP
dense_6
layer_names
dense_6
backend
GCOL
2.2.4
tensorflow
{"class_name": "Sequential", "config": {"name": "sequential_6", "layers": [{"class_name": "Dense", "config": {"name": "dense_6", "trainable": true, "batch_input_shape": [null, 1], "dtype": "float32", "units": 1, "activation": "linear", "use_bias": true, "kernel_initializer": {"class_name": "VarianceScaling", "config": {"scale": 1.0, "mode": "fan_avg", "distribution": "uniform", "seed": null}}, "bias_initializer": {"class_name": "Zeros", "config": {}}, "kernel_regularizer": null, "bias_regularizer": null, "activity_regularizer": null, "kernel_constraint": null, "bias_constraint": null}}]}}
tensorflow
2.2.4
{"optimizer_config": {"class_name": "SGD", "config": {"lr": 9.999999747378752e-05, "momentum": 0.8999999761581421, "decay": 0.0, "nesterov": true}}, "loss": "mse", "metrics": ["mse"], "sample_weight_mode": null, "loss_weights": null}
SNOD
keras_version
TREE
HEAP
dense_6
SNOD
weight_names
dense_6/kernel:0dense_6/bias:0
TREE
HEAP
kernel:0
bias:0
SNOD
SNOD
training_config
TREE
HEAP
SGD_5
training_5
weight_names
SGD_5/iterations:0
training_5/SGD/Variable:0
training_5/SGD/Variable_1:0
TREE
HEAP
iterations:0
SNOD
SNOD
TREE
HEAP
TREE
HEAP
Variable:0
Variable_1:0
SNOD
SNOD