
a-z-e-r-i-l-a / Pytorch Jacobian-Hessian
Last active Jun 8, 2019
I am looking for the most efficient way to compute the Jacobian of a function in PyTorch and have so far come up with the following solutions. Since there does not seem to be a big difference between using a loop in the first solution and the second one, I wanted to ask whether there might still be a faster way to calculate a Jacobian in PyTorch.
import torch
from torch.autograd import grad
from time import time

def func(X):
    # 3 scalar outputs; the 2nd and 3rd are illustrative (the snippet is truncated here)
    return torch.stack((X.pow(2).sum(), X.pow(3).sum(), X.pow(4).sum()))

X = torch.rand(int(1e5), requires_grad=True)  # input shape is an assumption
Y = func(X)

t = time()
J = torch.zeros(3, int(1e5))  # one gradient row per output
for i in range(3):
    # one backward pass per row of the Jacobian
    J[i] = grad(Y[i], X, create_graph=True, retain_graph=True, allow_unused=True)[0]
print(time() - t)
# Output: 0.002 s
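As an aside, newer PyTorch versions (1.5+) ship `torch.autograd.functional.jacobian`, which computes the full Jacobian without an explicit Python loop over outputs. A minimal sketch, reusing the same illustrative 3-output function and a smaller input so it runs quickly:

```python
import torch
from torch.autograd.functional import jacobian

def func(X):
    # same illustrative 3-output function as above
    return torch.stack((X.pow(2).sum(), X.pow(3).sum(), X.pow(4).sum()))

X = torch.rand(100, requires_grad=True)
# J has shape (3, 100): one row per output, one column per input element
J = jacobian(func, X)
print(J.shape)
```

Internally this still performs one backward pass per output element by default, so for a handful of outputs the timings should be comparable to the manual loop; its advantage is mainly convenience and correctness of bookkeeping.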