
@MaximumEntropy
Last active July 23, 2018 12:46
Padded RNN PyTorch
import torch
import torch.nn as nn
from torch.autograd import Variable
from torch.nn.utils.rnn import pad_packed_sequence, pack_padded_sequence

# Batch of 10 padded sequences, max length 20, feature size 30.
x = Variable(torch.randn(10, 20, 30)).cuda()
# Per-sequence lengths; pack_padded_sequence expects them sorted in decreasing order.
lens = range(10)
x = pack_padded_sequence(x, lens[::-1], batch_first=True)
lstm = nn.LSTM(30, 50, batch_first=True).cuda()
# Initial hidden and cell states: (num_layers, batch, hidden_size).
h0 = Variable(torch.zeros(1, 10, 50)).cuda()
c0 = Variable(torch.zeros(1, 10, 50)).cuda()
packed_h, (packed_h_t, packed_c_t) = lstm(x, (h0, c0))
# Unpack back to a padded tensor; pad_packed_sequence is time-major by default.
h, _ = pad_packed_sequence(packed_h)
print(h.size())  # Size 20 x 10 x 50 instead of 10 x 20 x 50
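
The shape noted in the final print comes from pad_packed_sequence, which returns time-major output (seq_len x batch x hidden) by default. A minimal sketch, assuming the same packed_h as above: passing batch_first=True keeps the batch dimension first, and the sequence dimension is the longest length in lens rather than the original padded length of 20.

# Hypothetical follow-up using the packed_h from above.
h, out_lens = pad_packed_sequence(packed_h, batch_first=True)
print(h.size())  # batch x longest length x hidden, i.e. 10 x max(lens) x 50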
hunkim commented Nov 2, 2017

Good example.

lens = range(10) -> lens = range(1,11)?

YCmove commented Jul 23, 2018

@hunkim you are right!
lens = range(10) will raise ValueError: Length of all samples has to be greater than 0, but found an element in 'lengths' that is <= 0.
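
For reference, a minimal runnable sketch with the suggested fix applied. The lengths (10 down to 1) are just an illustration, chosen so every length is positive and sorted in decreasing order, and it assumes PyTorch 0.4+ where Variable is no longer needed; details may differ on older versions.

import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_packed_sequence, pack_padded_sequence

x = torch.randn(10, 20, 30).cuda()
lens = list(range(1, 11))[::-1]  # [10, 9, ..., 1] -- every length > 0, descending
x = pack_padded_sequence(x, lens, batch_first=True)
lstm = nn.LSTM(30, 50, batch_first=True).cuda()
h0 = torch.zeros(1, 10, 50).cuda()
c0 = torch.zeros(1, 10, 50).cuda()
packed_h, (packed_h_t, packed_c_t) = lstm(x, (h0, c0))
h, _ = pad_packed_sequence(packed_h, batch_first=True)
print(h.size())  # torch.Size([10, 10, 50]): batch x longest length x hidden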
