@ceshine
Created September 30, 2017 02:11
Key Code Blocks of Keras LSTM Dropout Implementation
# https://github.com/tensorflow/tensorflow/blob/v1.3.0/tensorflow/contrib/keras/python/keras/layers/recurrent.py#L1197
class LSTM(Recurrent):
    # ...
    def step(self, inputs, states):
        # ... (h_tm1, c_tm1, dp_mask and rec_dp_mask are unpacked from `states` here;
        # the dropout masks are sampled once per forward pass in get_constants,
        # so the same mask is reused at every timestep)
        if self.implementation == 2:
            # Input dropout: mask the inputs before the input-to-hidden projection.
            z = K.dot(inputs * dp_mask[0], self.kernel)
            # Recurrent dropout: mask the previous hidden state h_tm1
            # before the hidden-to-hidden projection.
            z += K.dot(h_tm1 * rec_dp_mask[0], self.recurrent_kernel)
            if self.use_bias:
                z = K.bias_add(z, self.bias)
        # ...
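
For context, a minimal usage sketch (not part of the gist, assuming the Keras 2 Sequential API): the `dropout` argument drives dp_mask, which is applied to the layer inputs, and `recurrent_dropout` drives rec_dp_mask, which is applied to h_tm1, matching the two masked K.dot calls in step() above.

from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(64,
               input_shape=(100, 32),    # (timesteps, features)
               dropout=0.2,              # input dropout     -> dp_mask in step()
               recurrent_dropout=0.2,    # recurrent dropout -> rec_dp_mask in step()
               implementation=2))        # the branch excerpted above
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')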