
@apaszke
Last active January 16, 2023 07:20
import torch

def Rop(y, x, v):
    """Computes an Rop (Jacobian-vector product J @ v) via two backward passes.

    Arguments:
        y (Tensor): output of differentiated function
        x (Tensor): differentiated input
        v (Tensor): vector to be multiplied with Jacobian from the right
    """
    w = torch.ones_like(y, requires_grad=True)
    return torch.autograd.grad(torch.autograd.grad(y, x, w, create_graph=True), w, v)
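A quick sanity check of the trick (the example function y = x**2 is mine, not from the gist): for an elementwise square the Jacobian is diag(2x), so the Jacobian-vector product should equal 2 * x * v.

```python
import torch

def Rop(y, x, v):
    # Trick: differentiate w.r.t. x with a dummy ones-like seed w that itself
    # requires grad, then differentiate that result w.r.t. w with seed v.
    # The inner grad needs create_graph=True so it stays differentiable.
    w = torch.ones_like(y, requires_grad=True)
    return torch.autograd.grad(torch.autograd.grad(y, x, w, create_graph=True), w, v)

x = torch.randn(3, requires_grad=True)
y = x ** 2               # elementwise, so Jacobian = diag(2 * x)
v = torch.randn(3)
(jvp,) = Rop(y, x, v)    # should equal 2 * x * v
```

This costs two backward passes, but avoids needing forward-mode autodiff support.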