devnag / gist:ebeabae8542294f1c3a0e2f202da4360
Created February 21, 2017 15:50
PyTorch's superimposed gradients
import torch
import torch.nn as nn
from torch.autograd import Variable
import torch.optim as optim

a = torch.ones(1, 2)     # a single 2-dimensional input sample
b = nn.Linear(2, 1)      # linear layer: 2 inputs -> 1 output
b.zero_grad()            # start with cleared gradient buffers
c = b(Variable(a))       # forward pass (Variable wraps tensors in pre-0.4 PyTorch)
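The preview cuts off here. The "superimposed" behavior the title points at is that .grad buffers accumulate across backward() calls until they are explicitly zeroed; a minimal sketch of that demonstration (an assumption about where the gist goes next, not its exact code):

c.backward(torch.ones(1, 1))     # first backward pass fills b.weight.grad
first = b.weight.grad.clone()    # snapshot the gradient after one pass

c = b(Variable(a))               # fresh forward pass through the same layer
c.backward(torch.ones(1, 1))     # second backward ADDS onto .grad (superposition)
print(torch.equal(b.weight.grad, 2 * first))  # True: the two gradients summed

b.zero_grad()                    # clear the accumulated gradients before stepping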
devnag / gist:5aea1b240dba463781aa15b6ac5a79bb
Created February 20, 2017 20:56
Changes to make gan_pytorch.py use only squared diffs rather than squared diffs + original data
diff --git a/gan_pytorch.py b/gan_pytorch.py
index 0bff38c..802d6cb 100755
--- a/gan_pytorch.py
+++ b/gan_pytorch.py
@@ -31,7 +31,7 @@ g_steps = 1
 # ### Uncomment only one of these
 #(name, preprocess, d_input_func) = ("Raw data", lambda data: data, lambda x: x)
-(name, preprocess, d_input_func) = ("Data and variances", lambda data: decorate_with_diffs(data, 2.0), lambda x: x * 2)
+(name, preprocess, d_input_func) = ("Data and variances", lambda data: decorate_with_diffs(data, 2.0), lambda x: x)
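For context, d_input_func maps the raw sample width to the discriminator's input width, so it has to track whatever preprocess does to the feature dimension; the preview shows only this hunk, and the preprocessing function itself presumably changes elsewhere in the same patch. A hypothetical sketch of the two variants (decorate_with_diffs reconstructed from the description, not copied from gan_pytorch.py):

import torch

def decorate_with_diffs(data, exponent):
    # Hypothetical: append each sample's (deviation-from-mean)^exponent to the
    # raw values, doubling the feature width -> d_input_func = lambda x: x * 2
    mean = data.mean(dim=1, keepdim=True)
    return torch.cat([data, (data - mean) ** exponent], dim=1)

def squared_diffs_only(data, exponent):
    # This patch's intent: keep only the squared diffs, so the feature width
    # is unchanged -> d_input_func = lambda x: x
    mean = data.mean(dim=1, keepdim=True)
    return (data - mean) ** exponent

batch = torch.randn(4, 10)
print(decorate_with_diffs(batch, 2.0).size())   # torch.Size([4, 20])
print(squared_diffs_only(batch, 2.0).size())    # torch.Size([4, 10])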