@WillKoehrsen
Created October 1, 2018 01:15
# Keras layers and Model needed for this snippet
from keras.layers import Input, Embedding, Dot, Reshape, Dense
from keras.models import Model

# book_index, link_index (item -> integer index mappings) and embedding_size
# are assumed to be defined earlier in the notebook this gist comes from

# Both inputs are 1-dimensional: one integer index per example
book = Input(name = 'book', shape = [1])
link = Input(name = 'link', shape = [1])

# Embedding the book (shape will be (None, 1, 50) for embedding_size = 50)
book_embedding = Embedding(name = 'book_embedding',
                           input_dim = len(book_index),
                           output_dim = embedding_size)(book)

# Embedding the link (shape will be (None, 1, 50) for embedding_size = 50)
link_embedding = Embedding(name = 'link_embedding',
                           input_dim = len(link_index),
                           output_dim = embedding_size)(link)

# Merge the layers with a dot product along the embedding axis;
# normalize = True makes this a cosine similarity (shape will be (None, 1, 1))
merged = Dot(name = 'dot_product', normalize = True, axes = 2)([book_embedding, link_embedding])

# Reshape to be a single number (shape will be (None, 1))
merged = Reshape(target_shape = [1])(merged)

# Output neuron squashes the similarity to a probability
out = Dense(1, activation = 'sigmoid')(merged)
model = Model(inputs = [book, link], outputs = out)

# Minimize binary cross entropy
model.compile(optimizer = 'Adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
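
For context on how data flows into this model, here is a minimal, hypothetical training sketch that is not part of the original gist. The arrays are random placeholders; in practice each example is a single (book index, link index) pair labelled 1 for a true pair and 0 for a negative sample, which is why both Input layers use shape = [1].

import numpy as np

# Hypothetical placeholder data, only to illustrate the expected input shapes
n_examples = 1000
books = np.random.randint(0, len(book_index), size = (n_examples, 1))
links = np.random.randint(0, len(link_index), size = (n_examples, 1))
labels = np.random.randint(0, 2, size = (n_examples, 1))  # 1 = true pair, 0 = negative

# Inputs are passed by the layer names given above ('book' and 'link')
model.fit({'book': books, 'link': links}, labels, batch_size = 128, epochs = 5)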
@msamogh commented Aug 21, 2019

Why is the shape of book and link equal to 1? Shouldn't it be like a sequence or something?
