Gist by @lettergram (last active January 3, 2019)
import keras
from keras.preprocessing import sequence
from keras.models import Sequential
from keras.layers import Dense, Embedding, GlobalAveragePooling1D

# Example hyperparameters (assumed values; tune for your dataset)
max_words      = 10000  # vocabulary size
maxlen         = 400    # length of each padded input sequence
embedding_dims = 50     # dimensionality of the learned word embeddings
num_classes    = 2      # number of output categories
batch_size     = 32
epochs         = 5

model = Sequential()

# Create the embedding (input) layer: maps each of max_words token ids
# to an embedding_dims-dimensional vector
model.add(Embedding(max_words, embedding_dims, input_length=maxlen))

# Average the embeddings across the sequence dimension
model.add(GlobalAveragePooling1D())

# Create the output layer (one unit per class, softmax probabilities)
model.add(Dense(num_classes, activation='softmax'))

# Specify the loss function, optimizer, and evaluation metric
model.compile(loss='categorical_crossentropy',
              optimizer='adam', metrics=['accuracy'])

# Fit (train) the model on the training data (80% of the dataset)
model.fit(x_train, y_train, batch_size=batch_size,
          epochs=epochs, validation_data=(x_test, y_test))

# Evaluate the trained model on the test data (20% of the dataset)
score = model.evaluate(x_test, y_test, batch_size=batch_size)
print('Test loss: %.4f, test accuracy: %.4f' % (score[0], score[1]))
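
The snippet assumes x_train/x_test are already integer-encoded, padded sequences and y_train/y_test are one-hot labels. A minimal sketch of that preparation, which would run before model.fit above, assuming the IMDB dataset bundled with Keras (any integer-encoded corpus works the same way):

# Data preparation sketch (assumption: Keras IMDB dataset; the
# max_words, maxlen, and num_classes values match those above)
from keras.datasets import imdb
from keras.preprocessing import sequence
from keras.utils import to_categorical

(x_train, y_train), (x_test, y_test) = imdb.load_data(num_words=max_words)

# Pad/truncate every review to exactly maxlen token ids
x_train = sequence.pad_sequences(x_train, maxlen=maxlen)
x_test  = sequence.pad_sequences(x_test, maxlen=maxlen)

# One-hot encode the labels to match categorical_crossentropy
y_train = to_categorical(y_train, num_classes)
y_test  = to_categorical(y_test, num_classes)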