import sys

# Count word tokens and distinct word types on stdin.
types = set()
token_count = 0
for i, line in enumerate(sys.stdin):
    if i % 1000 == 0:
        # Progress indicator: one dot per 1000 lines, on a single line.
        print('.', end='', flush=True)
    tokens = line.strip().split()
    token_count += len(tokens)  # the original declared token_count but never updated it
    types.update(tokens)

print(f'\n{token_count} tokens, {len(types)} types')
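A minimal self-contained illustration of the token/type distinction the script relies on, using a tiny in-memory corpus instead of stdin (the two-line corpus here is a made-up example, not from the original):

```python
# "Tokens" are whitespace-split words; "types" are the distinct tokens.
corpus = ["the cat sat", "the cat ran"]

types = set()
token_count = 0
for line in corpus:
    tokens = line.split()
    token_count += len(tokens)
    types.update(tokens)

print(token_count, len(types))  # → 6 4  (6 tokens, 4 distinct types)
```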
""" | |
Because pytorch does not expose the internal activations of a module, | |
we must instead rerun the same exact function inside that module. | |
This is written specifically for a 1 layer LSTM with all default settings. | |
""" | |
import torch | |
import torch.nn as nn | |
from torch.autograd import Variable |
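A sketch of the "rerun the same computation" idea described above. It manually recomputes one time step of a default 1-layer `nn.LSTM` from its `weight_ih_l0`/`weight_hh_l0`/`bias_ih_l0`/`bias_hh_l0` parameters, so the gate activations become visible. The function name `lstm_step` is ours, and this assumes PyTorch's documented gate ordering (input, forget, cell, output); it is an illustration, not the original author's exact code:

```python
import torch
import torch.nn as nn


def lstm_step(lstm, x_t, h, c):
    """Recompute one step of a 1-layer nn.LSTM by hand, exposing the gates.

    x_t: (batch, input_size); h, c: (batch, hidden_size).
    """
    # All four gates are computed in one fused matrix multiply,
    # stacked in PyTorch's order: input, forget, cell, output.
    gates = (x_t @ lstm.weight_ih_l0.T + lstm.bias_ih_l0
             + h @ lstm.weight_hh_l0.T + lstm.bias_hh_l0)
    i, f, g, o = gates.chunk(4, dim=1)
    i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
    g = torch.tanh(g)
    c_next = f * c + i * g          # new cell state
    h_next = o * torch.tanh(c_next)  # new hidden state
    return h_next, c_next, (i, f, g, o)


if __name__ == "__main__":
    torch.manual_seed(0)
    lstm = nn.LSTM(input_size=5, hidden_size=7)  # 1 layer, default settings
    x = torch.randn(1, 3, 5)                     # (seq_len=1, batch=3, input=5)
    h0 = torch.zeros(1, 3, 7)
    c0 = torch.zeros(1, 3, 7)
    _, (hn, cn) = lstm(x, (h0, c0))
    h1, c1, _gates = lstm_step(lstm, x[0], h0[0], c0[0])
    # The manual step should match the module's output exactly.
    print(torch.allclose(h1, hn[0], atol=1e-6),
          torch.allclose(c1, cn[0], atol=1e-6))
```

The cross-check against `nn.LSTM` itself is what justifies trusting the exposed gate values.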