mortonjt / adasoft.py
Created July 21, 2021 03:42 — forked from rosinality/adasoft.py
Adaptive Softmax implementation for PyTorch
import torch
from torch import nn
from torch.autograd import Variable
class AdaptiveSoftmax(nn.Module):
    def __init__(self, input_size, cutoff):
        super().__init__()

        self.input_size = input_size
        self.cutoff = cutoff
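
The snippet above is truncated, so the rest of the class is not shown here. For reference, modern PyTorch ships a built-in adaptive softmax, `nn.AdaptiveLogSoftmaxWithLoss`, which implements the same idea: frequent classes live in a full-size head, while rarer classes are pushed into smaller tail clusters split at the given cutoffs. The sketch below uses that built-in API; the sizes and cutoff values are illustrative, not taken from the gist.

```python
import torch
from torch import nn

# Built-in adaptive softmax: classes [0, 100) go in the head,
# [100, 500) in the first tail cluster, [500, 1000) in the second.
# All dimensions here are illustrative.
asoft = nn.AdaptiveLogSoftmaxWithLoss(
    in_features=64, n_classes=1000, cutoffs=[100, 500]
)

hidden = torch.randn(8, 64)             # batch of hidden states
targets = torch.randint(0, 1000, (8,))  # target class per example

# Returns a named tuple: .output holds the log-probability of each
# target, .loss is the mean negative log-likelihood over the batch.
out = asoft(hidden, targets)
```

For full-vocabulary inference (rather than training loss), `asoft.log_prob(hidden)` returns log-probabilities over all classes.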