
@ashokc
Created January 26, 2019 18:05
Padded sequences from Keras
# Turn each text in X into a 200-long integer sequence, padding with 0 (appended after the
# words, since padding='post') or truncating as needed to keep every sequence at length 200.
import keras

sequenceLength = 200

# X is assumed to be a list (or array) of raw text documents.
kTokenizer = keras.preprocessing.text.Tokenizer()
kTokenizer.fit_on_texts(X)                          # build the word -> integer index from the corpus
encoded_docs = kTokenizer.texts_to_sequences(X)     # replace each word with its integer index
Xencoded = keras.preprocessing.sequence.pad_sequences(encoded_docs, maxlen=sequenceLength, padding='post')
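As a quick sanity check, here is how the snippet behaves on a tiny made-up corpus. The two sentences below and the printed values are illustrative assumptions standing in for X, not part of the original gist.

import keras

sequenceLength = 200

# Hypothetical two-document corpus standing in for X
X = ["the cat sat on the mat", "the dog barked"]

kTokenizer = keras.preprocessing.text.Tokenizer()
kTokenizer.fit_on_texts(X)                          # 'the' is the most frequent word, so it gets index 1
encoded_docs = kTokenizer.texts_to_sequences(X)     # e.g. [[1, 2, 3, 4, 1, 5], [1, 6, 7]]
Xencoded = keras.preprocessing.sequence.pad_sequences(
    encoded_docs, maxlen=sequenceLength, padding='post')

print(Xencoded.shape)       # (2, 200): every row is now exactly 200 integers long
print(Xencoded[1][:5])      # e.g. [1 6 7 0 0] - 'post' padding appends zeros after the words

With padding='post' the zeros go at the end of each sequence; the default 'pre' would instead prepend them, which matters for models that read the sequence left to right.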