@mmeendez8
Last active January 22, 2019 10:57
import tensorflow as tf

def encoder(X):
    """VAE encoder: maps inputs X to a latent sample z plus its mean and std dev."""
    activation = tf.nn.relu
    with tf.variable_scope("Encoder"):
        # Three conv layers: downsample twice (stride 2), then refine (stride 1)
        x = tf.layers.conv2d(X, filters=64, kernel_size=4, strides=2, padding='same', activation=activation)
        x = tf.layers.conv2d(x, filters=64, kernel_size=4, strides=2, padding='same', activation=activation)
        x = tf.layers.conv2d(x, filters=64, kernel_size=4, strides=1, padding='same', activation=activation)
        x = tf.layers.flatten(x)

        # Local latent variables
        mean_ = tf.layers.dense(x, units=FLAGS.latent_dim, name='mean')
        std_dev = tf.nn.softplus(tf.layers.dense(x, units=FLAGS.latent_dim), name='std_dev')  # softplus to force > 0

        # Reparametrization trick: z = mean + epsilon * std_dev, epsilon ~ N(0, I)
        epsilon = tf.random_normal(tf.stack([tf.shape(x)[0], FLAGS.latent_dim]), name='epsilon')
        z = mean_ + tf.multiply(epsilon, std_dev)

    return z, mean_, std_dev
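The reparametrization trick at the end of the encoder can be sketched framework-free in NumPy; this is an illustrative sketch (the batch size 4 and latent dimension 2 are made-up values, not from the gist), showing how sampling is rewritten as a deterministic function of the distribution parameters plus external noise:

```python
import numpy as np

def reparametrize(mean_, std_dev, rng):
    # Sample epsilon ~ N(0, I), then shift and scale it so that
    # z ~ N(mean_, std_dev^2). Because z is a deterministic function of
    # mean_ and std_dev, gradients can flow back through both of them.
    epsilon = rng.standard_normal(mean_.shape)
    return mean_ + epsilon * std_dev

rng = np.random.default_rng(0)
mean_ = np.zeros((4, 2))           # batch of 4, latent dimension 2
std_dev = np.full((4, 2), 0.5)     # must be > 0, hence softplus in the encoder
z = reparametrize(mean_, std_dev, rng)
print(z.shape)  # (4, 2)
```

Sampling `z` directly from `N(mean_, std_dev^2)` would not be differentiable with respect to the parameters; moving the randomness into `epsilon` is what makes the encoder trainable by backpropagation.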