Dat Duong (datduong)

@Tushar-N
Tushar-N / pad_packed_demo.py
Last active December 27, 2022 06:35
How to use pad_packed_sequence in PyTorch < 1.1.0
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

seqs = ['gigantic_string', 'tiny_str', 'medium_str']

# make <pad> idx 0
vocab = ['<pad>'] + sorted(set(''.join(seqs)))

# make model
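The preview cuts off at the model. A minimal sketch of how the demo typically continues, reusing the imports above and assuming an Embedding + LSTM; the layer sizes, sorting step, and variable names below are illustrative, not taken from the gist:

# encode each character with its vocab index
vectorized = [[vocab.index(ch) for ch in seq] for seq in seqs]
lengths = torch.tensor([len(v) for v in vectorized])

# pad everything to the longest sequence with <pad> (index 0)
padded = torch.zeros(len(vectorized), lengths.max().item(), dtype=torch.long)
for i, v in enumerate(vectorized):
    padded[i, :len(v)] = torch.tensor(v)

# before PyTorch 1.1.0, pack_padded_sequence required sequences sorted by decreasing length
lengths, sort_idx = lengths.sort(descending=True)
padded = padded[sort_idx]

embed = nn.Embedding(len(vocab), 10)    # illustrative sizes
lstm = nn.LSTM(10, 5, batch_first=True)

packed = pack_padded_sequence(embed(padded), lengths, batch_first=True)
out, (h, c) = lstm(packed)
out, out_lengths = pad_packed_sequence(out, batch_first=True)  # back to a padded tensor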
@bartolsthoorn
bartolsthoorn / multilabel_example.py
Created April 29, 2017 12:13
Simple multi-label classification example with PyTorch and MultiLabelSoftMarginLoss (https://en.wikipedia.org/wiki/Multi-label_classification)
import torch
import torch.nn as nn
import numpy as np
import torch.optim as optim
from torch.autograd import Variable

# (1, 0) => target labels 0 and 2
# (0, 1) => target label 1
# (1, 1) => target label 3
train = []
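The preview ends while building the training list. A minimal sketch of the rest of the setup with MultiLabelSoftMarginLoss, continuing from the imports above but using the current tensor API instead of Variable; the network architecture, hyperparameters, and tensors are illustrative assumptions, not the gist's code:

# inputs are 2-dimensional; targets are multi-hot vectors over 4 classes,
# matching the comment mapping above
x = torch.tensor([[1., 0.], [0., 1.], [1., 1.]])
y = torch.tensor([[1., 0., 1., 0.],   # labels 0 and 2
                  [0., 1., 0., 0.],   # label 1
                  [0., 0., 0., 1.]])  # label 3

model = nn.Sequential(nn.Linear(2, 10), nn.ReLU(), nn.Linear(10, 4))
criterion = nn.MultiLabelSoftMarginLoss()
optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    optimizer.zero_grad()
    loss = criterion(model(x), y)   # raw logits in, multi-hot targets
    loss.backward()
    optimizer.step()

# predicted label sets: apply sigmoid and threshold at 0.5
print((torch.sigmoid(model(x)) > 0.5).int())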
@kylemcdonald
kylemcdonald / _tsne.pdf
Last active February 22, 2024 22:13
Exploring antonyms with word2vec.
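The attached _tsne.pdf is not rendered here. A rough sketch of one way to probe antonym pairs with word2vec via gensim; the pretrained model file, word pairs, and queries below are assumptions for illustration, not the gist's own code:

from gensim.models import KeyedVectors

# pretrained Google News vectors, assumed to be available locally
wv = KeyedVectors.load_word2vec_format('GoogleNews-vectors-negative300.bin', binary=True)

pairs = [('good', 'bad'), ('hot', 'cold'), ('big', 'small')]
for a, b in pairs:
    # antonyms often score surprisingly high, since word2vec groups words
    # that appear in similar contexts
    print(a, b, wv.similarity(a, b))

# analogy-style probe: which words relate to 'hot' the way 'bad' relates to 'good'?
print(wv.most_similar(positive=['hot', 'bad'], negative=['good'], topn=3))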
@j-v
j-v / fixmyindent.vim
Created September 16, 2012 16:11
Fix my screwed up Python indentation in vim
" initially i was using 3-space (shiftwidth=3) indentation with noexpandtab, and tabstop=8, the default
" this switches it to proper 4-space indentation with expandtab
" convert tabs to spaces
set tabstop=8
set shiftwidth=3
set expandtab
retab
" convert spaces to tabs
i
me
my
myself
we
our
ours
ourselves
you
your