@amankharwal, created October 5, 2020
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dropout, Dense

def create_model(max_sequence_len, total_words):
    # Predictors are the input sequence minus the final (label) token
    input_len = max_sequence_len - 1
    model = Sequential()
    # Input embedding layer: a 10-dimensional vector per word in the vocabulary
    model.add(Embedding(total_words, 10, input_length=input_len))
    # Hidden layer: a single LSTM with 100 units, followed by dropout
    model.add(LSTM(100))
    model.add(Dropout(0.1))
    # Output layer: softmax over the vocabulary to predict the next word
    model.add(Dense(total_words, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='adam')
    return model

# predictors and label come from the preprocessing step (see the sketch below)
model = create_model(max_sequence_len, total_words)
model.fit(predictors, label, epochs=20, verbose=1)
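
The snippet assumes that predictors, label, max_sequence_len and total_words already exist from an earlier preprocessing step. For context, here is a minimal sketch of how they could be built with the Keras Tokenizer; the toy corpus and all variable values below are illustrative assumptions, not part of the original gist.

# Sketch of the assumed preprocessing: the corpus and values are placeholders
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
from keras.utils import to_categorical

corpus = ["the cat sat on the mat", "the dog sat on the log"]  # placeholder corpus

tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)
total_words = len(tokenizer.word_index) + 1

# Build n-gram sequences: "the cat sat" -> [the, cat], [the, cat, sat], ...
sequences = []
for line in corpus:
    tokens = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(tokens)):
        sequences.append(tokens[:i + 1])

max_sequence_len = max(len(seq) for seq in sequences)
sequences = pad_sequences(sequences, maxlen=max_sequence_len, padding='pre')

# The last token of each sequence is the label; the rest are the predictors
predictors, label = sequences[:, :-1], sequences[:, -1]
label = to_categorical(label, num_classes=total_words)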