@prabindh
Created May 13, 2019 14:45
lambda — a minimal example of a Keras Lambda layer subtracting the outputs of two shared Dense layers
import numpy as np
from keras.datasets import mnist
from keras.layers import Dense, Input, Lambda
from keras.losses import binary_crossentropy
from keras.models import Model
from keras.optimizers import SGD
# Flatten the 28x28 images and scale pixel values to [0, 1]
(train_x, train_y), (test_x, test_y) = mnist.load_data()
train_x = (train_x / 255.0).reshape(-1, 28*28)
test_x = (test_x / 255.0).reshape(-1, 28*28)
# Both inputs pass through the same Dense layers, so the weights are shared
inp1 = Input(shape=(28*28,))
inp2 = Input(shape=(28*28,))
l1 = Dense(100, activation="relu")
a1 = l1(inp1)
a2 = l1(inp2)
l2 = Dense(100, activation="relu")
b1 = l2(a1)
b2 = l2(a2)
# Lambda wraps an arbitrary expression as a layer; output_shape is only
# required when the backend (e.g. Theano) cannot infer the shape itself
output = Lambda(lambda inputs: inputs[0] - inputs[1], output_shape=(100,))([b1, b2])
model = Model([inp1, inp2], [output])
model.compile(loss=binary_crossentropy, optimizer=SGD())
# binary_crossentropy expects targets with the same shape as the output,
# so this smoke test uses a (1, 100) dummy target rather than a class label
y = np.zeros((1, 100))
model.fit([train_x[:1], train_x[:1]], y)
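
For a plain elementwise difference like this one, the Lambda layer is not strictly necessary: Keras also ships a built-in subtract layer (which the original imports but never uses) that infers the output shape on its own. A minimal sketch of that variant, reusing the tensors defined above:

from keras.layers import subtract

# subtract([b1, b2]) computes b1 - b2 and infers the output shape itself
output_alt = subtract([b1, b2])
model_alt = Model([inp1, inp2], [output_alt])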
@prabindh (Author) commented:
If your targets are one-hot encoded, use categorical_crossentropy; if they are integer class indices, use sparse_categorical_crossentropy.
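
To illustrate that comment, here is a hedged sketch pairing each loss with its target encoding. The 10-class softmax head (clf) is an assumption added for the example and is not part of the gist:

from keras.utils import to_categorical

# Hypothetical classification head on top of the Lambda output
clf = Dense(10, activation="softmax")(output)
clf_model = Model([inp1, inp2], clf)

# Integer class indices, e.g. train_y[:32] == array([5, 0, 4, ...]):
clf_model.compile(loss="sparse_categorical_crossentropy", optimizer=SGD())
clf_model.fit([train_x[:32], train_x[:32]], train_y[:32])

# One-hot targets of shape (n, 10):
clf_model.compile(loss="categorical_crossentropy", optimizer=SGD())
clf_model.fit([train_x[:32], train_x[:32]], to_categorical(train_y[:32], 10))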
