@hadifar
Last active January 5, 2019 20:31
# Linear-regression loss and gradients with eager execution (TensorFlow 1.x API)
import tensorflow as tf

tf.enable_eager_execution()

X = tf.constant([[1., 2.], [3., 4.]])
y = tf.constant([[1.], [2.]])
w = tf.get_variable(name='w', shape=[2, 1], initializer=tf.constant_initializer([[1.], [2.]]))
b = tf.get_variable(name='b', shape=[1], initializer=tf.constant_initializer([1.]))

# Record the forward pass so the tape can differentiate the loss.
with tf.GradientTape() as tape:
    C = 0.5 * tf.reduce_sum(tf.square(tf.matmul(X, w) + b - y))

# dC/dw and dC/db, computed from the operations recorded on the tape.
w_grad, b_grad = tape.gradient(C, [w, b])

print(C.numpy())       # 62.5
print(w_grad.numpy())  # [[35.], [50.]]
print(b_grad.numpy())  # [15.]
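The tape's results can be checked against the closed-form derivatives of the squared-error loss: for C = 0.5 * ||Xw + b - y||^2, dC/dw = X^T (Xw + b - y) and dC/db = sum(Xw + b - y). A minimal NumPy sketch of that check (independent of TensorFlow; the variable names mirror the gist):

```python
import numpy as np

X = np.array([[1., 2.], [3., 4.]])
y = np.array([[1.], [2.]])
w = np.array([[1.], [2.]])
b = np.array([1.])

residual = X @ w + b - y          # Xw + b - y
C = 0.5 * np.sum(residual ** 2)   # squared-error loss
w_grad = X.T @ residual           # dC/dw
b_grad = residual.sum(axis=0)     # dC/db

print(C)       # 62.5
print(w_grad)  # [[35.] [50.]]
print(b_grad)  # [15.]
```

The analytic values match what the GradientTape reports, which is a quick sanity check when adapting the loss.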