@sjebbara
Created September 21, 2016 12:56
Keras: Build Error Using TimeDistributed Recurrent Layer with Dropout
import numpy
from keras.models import Model
from keras.layers import Input, Embedding, GRU, TimeDistributed

batch_size = 2
n_sequences = 5
n_elements_per_sequence = 7
element_size = 12
dropout_W = 0.5
dropout_U = 0

# Input: a batch of sequences of sequences of vectors with shape
# (batch_size, n_sequences, n_elements_per_sequence, element_size).
sequence_of_sequences_input = Input(batch_shape=(batch_size, None, None, element_size),
                                    name='sequence_of_sequences_input')

# Encode each inner sequence with a GRU, producing one embedding per inner sequence.
rnn_layer = GRU(20, return_sequences=False, dropout_W=dropout_W, dropout_U=dropout_U)
sequence_embeddings = TimeDistributed(rnn_layer, name="sequence_embeddings")(sequence_of_sequences_input)

model = Model(sequence_of_sequences_input, sequence_embeddings)
model.compile("adam", "mse")

# Building the predict function fails here when dropout_W or dropout_U is > 0.
model._make_predict_function()

X = numpy.random.rand(batch_size, n_sequences, n_elements_per_sequence, element_size)
Y = model.predict_on_batch(X)
@sjebbara (Author):
model._make_predict_function() throws an error when either dropout_W or dropout_U is set to a value > 0.
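
For contrast, here is a minimal sketch of the same architecture with the dropout arguments left out (they default to 0). Based on the observation above that the error only appears when dropout_W or dropout_U is greater than 0, this variant is expected to build and predict; the sizes mirror the script above, and the output-shape comment is an assumption following from TimeDistributed wrapping a GRU with return_sequences=False.

import numpy
from keras.models import Model
from keras.layers import Input, GRU, TimeDistributed

batch_size = 2
n_sequences = 5
n_elements_per_sequence = 7
element_size = 12

inputs = Input(batch_shape=(batch_size, None, None, element_size), name='sequence_of_sequences_input')
# Same wrapped GRU, but without dropout_W/dropout_U (both default to 0).
encoded = TimeDistributed(GRU(20, return_sequences=False), name="sequence_embeddings")(inputs)

no_dropout_model = Model(inputs, encoded)
no_dropout_model.compile("adam", "mse")
no_dropout_model._make_predict_function()  # expected to build without error when dropout is 0

X = numpy.random.rand(batch_size, n_sequences, n_elements_per_sequence, element_size)
Y = no_dropout_model.predict_on_batch(X)  # expected shape: (batch_size, n_sequences, 20)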
