from tensorflow import keras
from tensorflow.keras.layers import Input, Embedding, Bidirectional, LSTM, Dropout, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import RMSprop

# Hyperparameters
DROPOUT_RATE = 0.3
LEARNING_RATE = 0.00005
NUM_EPOCHS = 10
BATCH_SIZE = 128
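# The gist assumes `tokenizer`, `embedding_matrix`, MAX_SEQUENCE_LENGTH and
# EMBEDDINGS_DIMENSION already exist. A minimal preprocessing sketch follows;
# the constant values, the placeholder corpus `texts` and the
# `pretrained_vectors` dict (e.g. loaded from GloVe) are assumptions, not part
# of the original gist.
import numpy as np
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

MAX_SEQUENCE_LENGTH = 250    # assumed value
EMBEDDINGS_DIMENSION = 100   # assumed value

texts = ["example document one", "example document two"]  # placeholder corpus
pretrained_vectors = {}      # hypothetical word -> np.ndarray mapping

tokenizer = Tokenizer()
tokenizer.fit_on_texts(texts)

# Row i holds the pre-trained vector for word index i; words without a
# pre-trained vector keep the all-zeros initialisation.
embedding_matrix = np.zeros((len(tokenizer.word_index) + 1, EMBEDDINGS_DIMENSION))
for word, i in tokenizer.word_index.items():
    vector = pretrained_vectors.get(word)
    if vector is not None:
        embedding_matrix[i] = vector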
sequence_input = Input(shape=(MAX_SEQUENCE_LENGTH,), dtype='int32')

# Embedding layer initialised with the pre-trained matrix and frozen during training
embedding_layer = Embedding(len(tokenizer.word_index) + 1,
                            EMBEDDINGS_DIMENSION,
                            weights=[embedding_matrix],
                            input_length=MAX_SEQUENCE_LENGTH,
                            trainable=False)
# Two stacked bidirectional LSTMs feeding a dense classification head
x = embedding_layer(sequence_input)
x = Bidirectional(LSTM(150, return_sequences=True))(x)
x = Dropout(0.2)(x)
x = Bidirectional(LSTM(100))(x)
x = Dense(128, activation='relu')(x)
preds = Dense(2, activation='softmax')(x)
# Compile the model.
print('compiling model')
model = Model(sequence_input, preds)
model.compile(loss='categorical_crossentropy',
              optimizer=RMSprop(learning_rate=LEARNING_RATE),
              metrics=['acc'])

# Plotting the graph requires pydot and graphviz to be installed.
keras.utils.plot_model(model, "multi_input_and_output_model.png", show_shapes=True)
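# NUM_EPOCHS and BATCH_SIZE are declared above but never used in the gist
# itself; a minimal training sketch, reusing the placeholder corpus from the
# preprocessing sketch above and hypothetical one-hot labels:
padded_sequences = pad_sequences(tokenizer.texts_to_sequences(texts),
                                 maxlen=MAX_SEQUENCE_LENGTH)
labels = keras.utils.to_categorical([0, 1], num_classes=2)  # placeholder labels

history = model.fit(padded_sequences, labels,
                    batch_size=BATCH_SIZE,
                    epochs=NUM_EPOCHS)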