@DuckSoft
Created July 11, 2018 06:06
A Simple Softmax Classifier Demo using PyTorch
import numpy as np
import pandas as pd
import torch
from torch.autograd import Variable

# A three-layer fully connected network; the final Softmax turns the
# output into a probability distribution over the three classes.
model = torch.nn.Sequential(
    torch.nn.Linear(3, 3, bias=True),
    torch.nn.ReLU(),
    torch.nn.Linear(3, 3, bias=True),
    torch.nn.ReLU(),
    torch.nn.Linear(3, 3, bias=True),
    torch.nn.ReLU(),
    torch.nn.Softmax(dim=1)
)
print(model)

# Load the training data: inputs are the waste counts, targets are one-hot labels.
data = pd.read_csv("data.csv")
data_x = np.array(data[["plastic", "paper", "glass"]], dtype=np.float32)
data_y = np.array(data[["student", "worker", "elder"]], dtype=np.float32)
x_train = torch.from_numpy(data_x)
y_train = torch.from_numpy(data_y)

num_epoch = 1000
loss_function = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)

for epoch in range(num_epoch):
    input = Variable(x_train)
    target = Variable(y_train)

    # forward
    out = model(input)
    loss = loss_function(out, target)

    # backward
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # show
    print('Epoch [{}/{}], loss: {:.6f}'.format(epoch + 1, num_epoch, loss.item()))

# predicting
print(model(torch.tensor([[500, 500, 500]], dtype=torch.float32)))
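The final print shows the softmax output, a probability distribution over the three classes. A minimal sketch of mapping it to a class name, continuing from the trained model above and assuming the label order student, worker, elder used in data.csv (this mapping is an assumption, not part of the original gist):

import torch

# Assumed label order, matching the target columns in data.csv.
class_names = ["student", "worker", "elder"]

# `model` is the trained network from the script above.
probs = model(torch.tensor([[500, 500, 500]], dtype=torch.float32))
predicted = torch.argmax(probs, dim=1).item()
print("predicted class:", class_names[predicted])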
data.csv
plastic,paper,glass,student,worker,elder
100,0,0,1,0,0
50,50,50,0,0,1
30,0,90,0,1,0
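If data.csv is not already on disk, a minimal sketch (not part of the original gist) that writes the same three rows with pandas so the script above can read them:

import pandas as pd

# Recreate the data.csv listed above.
rows = {
    "plastic": [100, 50, 30],
    "paper":   [0, 50, 0],
    "glass":   [0, 50, 90],
    "student": [1, 0, 0],
    "worker":  [0, 0, 1],
    "elder":   [0, 1, 0],
}
pd.DataFrame(rows).to_csv("data.csv", index=False)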