Training AutoEncoder in PyTorch
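The training loop below assumes that a model class AE, a device, a DataLoader named loader, and a batch_size are already defined; they are not part of this snippet. A minimal sketch of that setup, assuming a small fully connected autoencoder and the MNIST dataset (the layer sizes and data pipeline here are illustrative, not the gist author's exact definitions):

import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

class AE(nn.Module):
    """A small fully connected autoencoder for flattened 28x28 inputs (assumed architecture)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 32))
        self.decoder = nn.Sequential(nn.Linear(32, 128), nn.ReLU(), nn.Linear(128, 784), nn.Sigmoid())

    def forward(self, x):
        return self.decoder(self.encoder(x))

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
batch_size = 64
# MNIST is an assumption; any dataset of 28x28 images matches the 784-dim input
dataset = datasets.MNIST(root="./data", train=True, download=True, transform=transforms.ToTensor())
loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)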
import torch
import torch.nn as nn
import torch.optim as optim

ae = AE()                                  # autoencoder model (see setup above)
ae.to(device)
criterion = nn.MSELoss()                   # reconstruction loss
optimizer = optim.Adamax(ae.parameters(), lr=1e-4)

l = None
for epoch in range(100):
    for i, data in enumerate(loader, 0):
        inputs, classes = data
        # Flatten 28x28 images to 784-dim vectors; labels are not used for reconstruction.
        # Variable wrappers are no longer needed in modern PyTorch.
        inputs = inputs.view(-1, 784).to(device)
        classes = classes.to(device)

        optimizer.zero_grad()
        out = ae(inputs)                   # reconstruct the input
        loss = criterion(out, inputs)      # compare reconstruction to the original
        loss.backward()
        optimizer.step()
        l = loss.item()
    print(epoch, l)