
Gist by @ceshine, last active September 30, 2017
Key Code Blocks of Keras LSTM Dropout Implementation
# https://github.com/tensorflow/tensorflow/blob/v1.3.0/tensorflow/contrib/keras/python/keras/layers/recurrent.py#L1163
class LSTM(Recurrent):
    # ...
    def get_constants(self, inputs, training=None):
        # ...
        input_shape = K.int_shape(inputs)
        input_dim = input_shape[-1]
        # Build a (batch_size, input_dim) tensor of ones from the first timestep.
        ones = K.ones_like(K.reshape(inputs[:, 0, 0], (-1, 1)))
        ones = K.tile(ones, (1, int(input_dim)))

        def dropped_inputs():
            return K.dropout(ones, self.dropout)

        # One independent dropout mask per LSTM gate (input, forget, cell,
        # output); each mask is generated once per batch and reused at every
        # timestep, so a dropped input feature stays dropped for the whole
        # sequence. At inference time the plain `ones` tensor is used instead.
        dp_mask = [
            K.in_train_phase(dropped_inputs, ones, training=training)
            for _ in range(4)
        ]
        # ...
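A minimal NumPy sketch of what the snippet above computes, with a hypothetical helper name and assuming the standard inverted-dropout convention (surviving entries scaled by 1/keep_prob, as `K.dropout` does):

```python
import numpy as np

def make_lstm_dropout_masks(batch_size, input_dim, rate, rng=None):
    """Sketch of the Keras logic: four Bernoulli dropout masks, one per
    LSTM gate, each of shape (batch_size, input_dim)."""
    rng = rng or np.random.default_rng(0)
    keep = 1.0 - rate
    masks = []
    for _ in range(4):  # input, forget, cell, output gates
        # Inverted dropout: zero out with probability `rate`,
        # scale survivors by 1/keep so the expected value is unchanged.
        m = (rng.random((batch_size, input_dim)) < keep).astype(np.float32) / keep
        masks.append(m)
    return masks

masks = make_lstm_dropout_masks(batch_size=2, input_dim=5, rate=0.5)
# Each mask would be multiplied into the gate's input at every timestep,
# so the same features are dropped for the entire sequence.
```

Note that four separate masks are drawn (the `for _ in range(4)` above), so each gate sees an independent dropout pattern, but within a gate the pattern is constant across timesteps.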