sjebbara / keras_distributed_rnn_dropout_error.py
Created September 21, 2016 12:56
Keras: Build Error Using TimeDistributed Recurrent Layer with Dropout
import numpy
from keras.models import Model
from keras.layers import Input, Embedding, GRU, TimeDistributed
batch_size = 2
n_sequences = 5
n_elements_per_sequence = 7
element_size = 12
dropout_W = 0.5
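The shapes above feed a `TimeDistributed`-wrapped recurrent layer: the wrapper folds the sequence axis into the batch axis, builds and applies the inner layer once on the folded shape, and unfolds the result, which is why the wrapped layer must build cleanly there. A minimal numpy sketch of that fold/apply/unfold pattern (illustrative only, not the Keras implementation; `last_state` is a hypothetical stand-in for a GRU returning its final state):

```python
import numpy as np

def time_distributed(layer_fn, x):
    """Apply layer_fn independently to each leading "sequence" slot.

    x has shape (batch, n_sequences, *inner_shape). The sequence axis is
    folded into the batch axis, the inner layer runs once, and the result
    is unfolded. Sketch of the TimeDistributed idea, not Keras' code.
    """
    batch, n_seq = x.shape[0], x.shape[1]
    folded = x.reshape((batch * n_seq,) + x.shape[2:])
    y = layer_fn(folded)
    return y.reshape((batch, n_seq) + y.shape[1:])

# (batch, sequences, elements per sequence, element size), as in the gist
x = np.ones((2, 5, 7, 12))
# Hypothetical stand-in for a recurrent layer that reduces the timestep
# axis, like a GRU returning its last state: (N, steps, feat) -> (N, feat)
last_state = lambda t: t[:, -1, :]
y = time_distributed(last_state, x)
print(y.shape)  # -> (2, 5, 12)
```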
sjebbara / keras_distributed_dropout_error.py
Created October 24, 2016 16:06
MissingInputError in Keras: Distributing any layer that uses the learning phase throws an exception
from keras.models import Model
from keras.layers import Input, TimeDistributed, Dropout
in1 = Input(batch_shape=(10, 8, 6), name="in1")
# The following Dropout() layer is closely related to the error,
# although the exception is actually raised later, in model._make_predict_function().
# Swapping Dropout() for another layer that uses the learning phase
# (e.g. GaussianNoise()) causes the same issue.
# Other layers, like Dense(), work fine.
out1 = TimeDistributed(Dropout(0.5))(in1)
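What distinguishes Dropout from Dense here is that Dropout branches on a training/test flag, the "learning phase", which Keras wires in as an extra symbolic input; plausibly that is the input that goes missing when the layer is distributed. A minimal sketch of the phase-dependent behaviour (inverted dropout, assumptions only, not Keras internals):

```python
import numpy as np

def dropout(x, rate, training):
    """Inverted dropout: active only in training, identity at test time.

    The branch on `training` is the "learning phase" that Dropout and
    GaussianNoise depend on, and which Dense does not. Sketch only.
    """
    if not training:
        return x
    keep = 1.0 - rate
    mask = (np.random.rand(*x.shape) < keep).astype(x.dtype)
    # Scale surviving units up so the expected activation is unchanged
    return x * mask / keep

x = np.ones((10, 8, 6))  # same batch_shape as in1 above
# Test time: the layer is the identity
assert np.array_equal(dropout(x, 0.5, training=False), x)
# Train time: each unit is either dropped (0.0) or scaled up (2.0)
out = dropout(x, 0.5, training=True)
```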