
myke (MikulasZelinka)

  • Prague
MikulasZelinka /
Last active Jul 31, 2019
pytorch: handling sentences of arbitrary length (dataset, data_loader, padding, embedding, packing, lstm, unpacking)
a sort-of minimal end-to-end example of handling input sequences (sentences) of variable length in pytorch
the sequences are considered to be sentences of words, so we want to use embeddings and an RNN
pytorch is used for basically everything in the pipeline of:
dataset -> data_loader -> padding -> embedding -> packing -> lstm -> unpacking (~padding)
based mostly on:
pytorch version 1.1.0
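The gist body itself is not reproduced here, but the pipeline the description lists can be sketched roughly as follows. This is a minimal sketch, not the gist's actual code: the toy vocabulary, tensor sizes, and the choice of index 0 as the padding index are all assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence, pad_packed_sequence

class SentenceDataset(Dataset):
    # dataset: toy "sentences" stored as tensors of word indices (0 reserved for padding)
    def __init__(self, sentences):
        self.sentences = sentences

    def __len__(self):
        return len(self.sentences)

    def __getitem__(self, idx):
        return self.sentences[idx]

def collate(batch):
    # padding: pad each batch to the length of its longest sentence
    lengths = torch.tensor([len(s) for s in batch])
    padded = pad_sequence(batch, batch_first=True, padding_value=0)
    return padded, lengths

sentences = [torch.tensor([1, 2, 3, 4]), torch.tensor([5, 6]), torch.tensor([7, 8, 9])]
# data_loader: batches variable-length sentences via the custom collate_fn
loader = DataLoader(SentenceDataset(sentences), batch_size=3, collate_fn=collate)

embedding = nn.Embedding(num_embeddings=10, embedding_dim=8, padding_idx=0)
lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

padded, lengths = next(iter(loader))
# embedding: (batch, max_len) -> (batch, max_len, embedding_dim)
embedded = embedding(padded)
# packing: lets the LSTM skip padded positions (enforce_sorted=False needs torch >= 1.1)
packed = pack_padded_sequence(embedded, lengths, batch_first=True, enforce_sorted=False)
packed_out, (h, c) = lstm(packed)
# unpacking (~padding): back to a padded (batch, max_len, hidden_size) tensor
unpacked, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
print(unpacked.shape)  # torch.Size([3, 4, 16])
```

The custom `collate_fn` is the key piece: the default collate would fail on tensors of unequal length, so padding has to happen per batch, and the original lengths must be carried along so packing can later ignore the padded positions.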
MikulasZelinka /
Created Sep 20, 2017
Infinitely many people gather to wrap their heads around a coin-flipping game
import random

# Game: n people, each person flips a coin until they get heads
# Question: what is the ratio of heads after the game ends?
n = 1024
heads = 0
tails = 0
for _ in range(n):
    while random.random() < 0.5:  # tails: keep flipping
        tails += 1
    heads += 1  # this person's game ends on the first heads
# how many rounds does each game last on average (just for fun):
iterations = heads + tails  # total flips across all people
print(heads / iterations)  # ratio of heads, around 0.5
print(iterations / n)      # average flips per person, around 2
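As a sanity check on the simulation above: each person's flip count follows a geometric distribution with p = 1/2, so the expected number of flips per person is the sum over k >= 1 of k * (1/2)^k = 2, and the overall heads ratio tends to n / 2n = 1/2 for large n. A quick numerical check (truncating the sum at k = 200 is just to make it finite; the tail is negligible):

```python
# expected flips per person under Geometric(p = 1/2): sum of k * (1/2)**k for k >= 1
expected_flips = sum(k * 0.5 ** k for k in range(1, 200))
print(round(expected_flips, 6))  # 2.0
```

Note that for finite n the *expected value* of the simulated heads ratio is not exactly 1/2 (the ratio is a nonlinear function of the random flip counts); it only converges to 1/2 as n grows.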