@LimHyungTae
Created January 15, 2020 19:58
Cross Entropy
import torch
import torch.nn as nn
import torch.nn.functional as F

num_classes = 5

# 1D case: a batch of 3 samples with one logit per class.
input = torch.randn(3, num_classes, requires_grad=True)
print(input)
target = torch.randint(num_classes, (3,), dtype=torch.int64)
print(target)

# Softmax over the class dimension (dim=1), printed only for inspection.
softmax = nn.Softmax(dim=1)
softmaxed = softmax(input)
print(softmaxed)

# F.cross_entropy applies log_softmax internally, so it must be given
# the raw logits, not the softmaxed probabilities.
loss = F.cross_entropy(input, target, reduction='none')
print(loss)

# 2D case: 3 samples over a 7x7 spatial grid (e.g. segmentation).
# Input is (N, C, H, W); target is (N, H, W) holding class indices.
input2d = torch.randn(3, num_classes, 7, 7, requires_grad=True)
target2d = torch.randint(num_classes, (3, 7, 7), dtype=torch.int64)
print(input2d)
print(target2d)

# The class dimension is still dim=1, not dim=2.
softmax = nn.Softmax(dim=1)
softmaxed = softmax(input2d)

# Again, pass the raw logits; the default reduction averages over all pixels.
loss = F.cross_entropy(input2d, target2d)
print(loss)
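
As a sanity check (a minimal sketch; the names logits, target, ce, nll are illustrative), F.cross_entropy combines log_softmax and nll_loss, which is why it expects raw logits rather than probabilities. The same per-sample loss can also be recovered by hand from the softmax output:

import torch
import torch.nn.functional as F

num_classes = 5
logits = torch.randn(3, num_classes)
target = torch.randint(num_classes, (3,), dtype=torch.int64)

# cross_entropy == nll_loss(log_softmax(logits)): both expect raw logits.
ce = F.cross_entropy(logits, target, reduction='none')
nll = F.nll_loss(F.log_softmax(logits, dim=1), target, reduction='none')
print(torch.allclose(ce, nll))  # True

# Equivalently, each per-sample loss is -log of the softmax probability
# assigned to the true class.
probs = F.softmax(logits, dim=1)
manual = -torch.log(probs[torch.arange(3), target])
print(torch.allclose(ce, manual))  # True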
@LimHyungTae (Author)

Note that
input2d.size() -> torch.Size([3, 5, 7, 7])
target2d.size() -> torch.Size([3, 7, 7])
The target has no class dimension because each entry is a class index; the class dimension of the input stays at dim=1 in both the 1D and the 2D case.
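
As a self-contained follow-up sketch (re-creating input2d and target2d with the same shapes as above): with reduction='none', the 2D case returns one loss per pixel, matching the target's (N, H, W) shape, while the default 'mean' reduction averages over all N*H*W pixels.

import torch
import torch.nn.functional as F

num_classes = 5
input2d = torch.randn(3, num_classes, 7, 7)
target2d = torch.randint(num_classes, (3, 7, 7), dtype=torch.int64)

# Per-pixel losses: shape (3, 7, 7), the same as target2d.
pixel_loss = F.cross_entropy(input2d, target2d, reduction='none')
print(pixel_loss.shape)  # torch.Size([3, 7, 7])

# The default 'mean' reduction averages over all 3*7*7 pixels.
print(torch.allclose(pixel_loss.mean(), F.cross_entropy(input2d, target2d)))  # True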
