@ikhlestov · Created September 13, 2017 19:02
pytorch: excluding subgraphs from backward
# Written against the pre-0.4 PyTorch API, where autograd used Variable
# wrappers and the volatile flag.
import torch
from torch.autograd import Variable
# requires_grad
# If even a single input to an operation requires a gradient,
# its output will also require a gradient.
x = Variable(torch.randn(5, 5))
y = Variable(torch.randn(5, 5))
z = Variable(torch.randn(5, 5), requires_grad=True)
a = x + y
a.requires_grad # False
b = a + z
b.requires_grad # True
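# Added illustration (not in the original gist): because only z requires grad,
# calling backward on b computes a gradient for z alone; x and y are excluded
# from the backward pass, so their .grad stays None.
b.sum().backward()
z.grad  # Variable holding a 5x5 tensor of ones
x.grad  # None
y.grad  # None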
# volatile
# Volatile differs from requires_grad in how the flag propagates:
# if there's even a single volatile input to an operation,
# its output is also going to be volatile (and will not require grad).
x = Variable(torch.randn(5, 5), requires_grad=True)
y = Variable(torch.randn(5, 5), volatile=True)
a = x + y
a.requires_grad # False
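a.volatile # True -- the volatile flag also propagates to the output
# Note (added, not in the original gist): volatile implied requires_grad=False
# for the whole subgraph, which is why a.requires_grad is False above even
# though x requires grad. In PyTorch 0.4+ the volatile flag was removed; the
# equivalent inference-only pattern is the torch.no_grad() context manager:
#
#     with torch.no_grad():
#         a = x + y  # no graph is recorded, a.requires_grad is False
#
# A common practical sketch of excluding a subgraph from backward (assumption:
# torchvision is installed; this mirrors the usual fine-tuning pattern and is
# not part of the original gist): freeze a pretrained network and train only a
# newly added classifier layer.
import torch.nn as nn
import torch.optim as optim
import torchvision

model = torchvision.models.resnet18(pretrained=True)
for param in model.parameters():
    param.requires_grad = False  # exclude the whole backbone from backward
# Newly constructed modules require grad by default, so only this layer trains.
model.fc = nn.Linear(512, 100)
# Pass only the trainable parameters to the optimizer.
optimizer = optim.SGD(model.fc.parameters(), lr=1e-2, momentum=0.9)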