@devnag
Created February 21, 2017 15:50
PyTorch's superimposed gradients

Repeated calls to backward() sum new gradients into each parameter's .grad buffer rather than overwriting it; the buffer must be cleared explicitly (e.g. with zero_grad()) when a fresh gradient is wanted.
import torch
import torch.nn as nn

# Variable is deprecated; plain tensors carry autograd state in modern PyTorch.
a = torch.ones(1, 2)
b = nn.Linear(2, 1)  # .grad starts as None, so no zero_grad() is needed up front

c = b(a)
c.backward()
print(b.weight.grad)  # tensor([[1., 1.]])

c = b(a)
c.backward()
print(b.weight.grad)  # tensor([[2., 2.]]); the new gradient was added to the old one

b.zero_grad(set_to_none=False)  # set_to_none=False keeps a zero tensor instead of None
print(b.weight.grad)  # tensor([[0., 0.]]); gradients were reset

c = b(a)
c.backward()
print(b.weight.grad)  # tensor([[1., 1.]]) again
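
This accumulation is exactly what makes gradient accumulation over several small batches work: call backward() a few times, then take a single optimizer step on the summed gradient. A minimal sketch, where the model, loss, and data are made up for illustration:

import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(2, 1)
opt = optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

accum_steps = 4  # number of small batches to fold into one update
opt.zero_grad()
for step in range(8):
    x = torch.randn(1, 2)  # one small (here: single-sample) batch
    y = torch.randn(1, 1)
    loss = loss_fn(model(x), y) / accum_steps  # scale so the sum behaves like a mean
    loss.backward()  # gradients pile up in .grad across iterations
    if (step + 1) % accum_steps == 0:
        opt.step()       # one update using the accumulated gradient
        opt.zero_grad()  # clear .grad before the next accumulation window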