@mzaradzki
Last active April 4, 2017 08:09
CBOW word embedding in Keras
# CBOW: sum the context-word embeddings, then predict the target word with a
# softmax over the vocabulary. Written against the Keras 1.x API; n_words,
# vocab_size, embedding_dimension and the EMBEDDING layer are defined elsewhere.
from keras.models import Sequential
from keras.layers import Permute, Lambda, Dense
from keras import backend as K

modelWRD = Sequential()
# 1st layer is a dummy permutation (identity) used only to declare the input shape
modelWRD.add(Permute((1,), input_shape=(n_words,)))
modelWRD.add(EMBEDDING)  # maps each word index to an embedding_dimension vector
modelWRD.add(Lambda(
    lambda x: K.sum(x, axis=1),  # sum over the context words
    output_shape=(embedding_dimension,)))
# Dense is a linear map followed by an activation
modelWRD.add(Dense(vocab_size, activation='softmax'))
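The same forward pass (embedding lookup, sum over context words, linear map, softmax) can be sketched in plain NumPy to make the shapes explicit. This is an illustrative sketch, not part of the gist: the matrices `E`, `W`, `b` and the toy sizes are assumptions standing in for the trained `EMBEDDING` and `Dense` weights.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embedding_dimension, n_words = 20, 8, 4

# Stand-ins for the trained weights (assumed, for illustration only):
E = rng.normal(size=(vocab_size, embedding_dimension))  # embedding matrix
W = rng.normal(size=(embedding_dimension, vocab_size))  # Dense kernel
b = np.zeros(vocab_size)                                # Dense bias

def cbow_forward(context_ids):
    # Look up and sum the context-word embeddings (the Lambda layer above)
    h = E[context_ids].sum(axis=0)        # shape: (embedding_dimension,)
    # Linear map followed by a numerically stable softmax (the Dense layer)
    logits = h @ W + b                    # shape: (vocab_size,)
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()                # probabilities over the vocabulary

probs = cbow_forward([3, 7, 1, 12])       # n_words context word indices
```

Because the context embeddings are summed (not concatenated), the prediction is invariant to the order of the context words, which is the defining property of CBOW.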