@MohanaRC
Created November 7, 2022 13:46
import tensorflow as tf

def tf_gradient_tape_no_persistent(x):
    """
    Simple implementation to understand how GradientTape applies the chain rule
    and returns intermediate derivatives. The tape is created with persistent=True
    because gradient() is called twice on the same tape.
    Inputs:
        x: Tensor value
    Returns:
        EagerTensor: derivative of y with respect to x, and derivative of z with respect to x
    """
    with tf.GradientTape(persistent=True) as t:
        t.watch(x)   # Explicitly watch x, since constants are not tracked by default
        y = x * x    # Defining y(x) = x**2
        z = y * y    # Defining z(y) = y**2, i.e. z(x) = x**4
    # Calculating the derivative of y with respect to x
    dy_dx = t.gradient(y, x)
    # Calculating the derivative of z with respect to x
    dz_dx = t.gradient(z, x)
    return dy_dx, dz_dx
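# A minimal sketch (not part of the original gist) of what happens with the default,
# non-persistent tape: the tape's resources are released after the first gradient()
# call, so a second call raises a RuntimeError. The helper name is hypothetical.
def tf_gradient_tape_single_use(x):
    with tf.GradientTape() as t:   # persistent defaults to False
        t.watch(x)
        y = x * x
        z = y * y
    dy_dx = t.gradient(y, x)       # First call works
    try:
        dz_dx = t.gradient(z, x)   # Second call fails on a non-persistent tape
    except RuntimeError as e:
        print("Second gradient call failed:", e)
        dz_dx = None
    return dy_dx, dz_dx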
# Run tf_gradient_tape_no_persistent for x = 3
tmp_x = tf.constant(3.0)
dy_dx, dz_dx = tf_gradient_tape_no_persistent(tmp_x)
result_zx = dz_dx.numpy()
result_yx = dy_dx.numpy()
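# Sanity check (not in the original gist): for x = 3, y = x**2 gives dy/dx = 2*x = 6.0,
# and z = y**2 = x**4 gives dz/dx = 4*x**3 = 108.0.
print("dy/dx at x=3:", result_yx)   # Expected: 6.0
print("dz/dx at x=3:", result_zx)   # Expected: 108.0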