@Orbifold
Created September 1, 2018 12:49
PyTorch hello world
	import torch


	batch_size = 32
	input_shape = 5
	output_shape = 10


	# Random input and target; torch.autograd.Variable is deprecated,
	# plain tensors now carry autograd state themselves
	X = torch.randn(batch_size, input_shape)
	y = torch.randn(batch_size, output_shape)


	# Two stacked linear layers (note: no nonlinearity in between)
	model = torch.nn.Sequential(
	    torch.nn.Linear(input_shape, 32),
	    torch.nn.Linear(32, output_shape),
	)


	loss_function = torch.nn.MSELoss()


	learning_rate = 0.001
	for i in range(10):
	    y_pred = model(X)
	    loss = loss_function(y_pred, y)
	    print(loss.item())  # loss.data[0] no longer works on 0-dim tensors
	    # Zero gradients accumulated from the previous step
	    model.zero_grad()
	    loss.backward()
	    # Manual SGD update; autograd must not track the in-place change
	    with torch.no_grad():
	        for param in model.parameters():
	            param -= learning_rate * param.grad
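The manual parameter loop above is exactly what `torch.optim.SGD` implements, so the training step can equivalently use the optimizer API. A minimal sketch with the same model shape and learning rate (data and seed are illustrative):

```python
import torch

torch.manual_seed(0)
X = torch.randn(32, 5)
y = torch.randn(32, 10)

model = torch.nn.Sequential(
    torch.nn.Linear(5, 32),
    torch.nn.Linear(32, 10),
)
loss_function = torch.nn.MSELoss()
# Plain SGD with this lr performs param -= lr * param.grad, as in the manual loop
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)

losses = []
for i in range(10):
    optimizer.zero_grad()   # replaces model.zero_grad()
    loss = loss_function(model(X), y)
    loss.backward()
    optimizer.step()        # replaces the manual parameter update loop
    losses.append(loss.item())
```

Using the optimizer object also makes it trivial to swap in Adam or add momentum without touching the training loop.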

tensor(1.1245)
tensor(1.1242)
tensor(1.1238)
tensor(1.1235)
tensor(1.1231)
tensor(1.1228)
tensor(1.1225)
tensor(1.1221)
tensor(1.1218)
tensor(1.1215)
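Note that two `Linear` layers with no activation between them compose to a single affine map, so the second layer adds no expressive power here. A common variant inserts a nonlinearity between them (a sketch; the choice of ReLU is illustrative):

```python
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(5, 32),
    torch.nn.ReLU(),          # nonlinearity between the two affine layers
    torch.nn.Linear(32, 10),
)
out = model(torch.randn(32, 5))
print(out.shape)  # torch.Size([32, 10])
```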