
@syhw
syhw / itc
Created September 3, 2019 04:40
#!/bin/bash
# tmux requires unrecognized OSC sequences to be wrapped with DCS tmux;
# <sequence> ST, and for all ESCs in <sequence> to be replaced with ESC ESC. It
# only accepts ESC backslash for ST.
function print_osc() {
  # Wrap in the tmux DCS passthrough only when actually running under
  # tmux/screen; a plain `-n $TERM` test would be true in any terminal.
  if [[ $TERM == screen* || -n $TMUX ]] ; then
    printf "\033Ptmux;\033\033]"
  else
    printf "\033]"
  fi
}
@syhw
syhw / (FL)OPS
Created January 12, 2019 16:33
_theoretical_ FLOPS (counting FMA = 2 OPS)
category | model                      | FLOPS       | HPOPS     | AVX / cores / CUDA cores...
CPU      | i9-8950HK (MacBookPro)     | 278 GFLOPS  | 556 GOPS? | AVX2, 6 cores
CPU      | E5-2698 v4                 | 704 GFLOPS  | 1.4 TOPS? | AVX2, 20 cores
CPU      | Xeon Plat. 8160 (~GCP/AWS) | 3.2 TFLOPS  | 6.4 TOPS? | AVX-512 x2, 24 cores (2.1*24*512/32*2*2)
CPU      | Threadripper 2990WX        | 1.5 TFLOPS  | 3 TOPS?   | AVX2, 32 cores (3*32*256/32*2)
GPU      | K80                        | 8.7 TFLOPS  |           | 2496 CUDA cores x2 (=4992)
GPU      | 1080 Ti                    | 10.6 TFLOPS |           | 3584:224:88
GPU      | AMD RX Vega 64             | 11.5 TFLOPS | 23 TOPS   | 4096:256:64
GPU      | M60                        | 9.6 TFLOPS  |           | 2048:128:64 x2 (i.e. x2 GPU dies)
GPU      | P100 (NVLink)              | 10.6 TFLOPS | 21 TOPS   | 3584:224:88?
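The parenthesized calculations above are just clock (GHz) x cores x 32-bit SIMD lanes x FMA units x 2 OPS per FMA. A minimal Python sketch (the function name is illustrative; the inputs are the numbers from the parentheses):

def peak_gflops(clock_ghz, cores, simd_bits, fma_units, flops_per_fma=2):
    """Theoretical single-precision peak in GFLOPS."""
    lanes = simd_bits / 32                  # 32-bit floats per SIMD register
    return clock_ghz * cores * lanes * fma_units * flops_per_fma

print(peak_gflops(2.1, 24, 512, 2))  # Xeon Plat. 8160: ~3226 GFLOPS ~= 3.2 TFLOPS
print(peak_gflops(3.0, 32, 256, 1))  # Threadripper 2990WX: 1536 GFLOPS ~= 1.5 TFLOPS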
@syhw
syhw / gist:cf6644c7fb73b02a0131
Created December 16, 2015 18:01
<your moba here> (DOTA 2) heroes embedding
5v5 matches
number of heroes in the pool = K
dimension of the embedding = E
- encode a hero as a one-hot of heroes = 1-of-K
- learn a (K, E) matrix to go from hero -> vector (+ bias)
(notice that it can do set-of-heroes -> vector too)
- learn a logistic regression from both the team1 and team2 embeddings to predict the winner, backpropagating through the embedding (a minimal sketch follows this list)
- do stats and t-SNE plots of embeddings of single heroes or combinations (teams) of heroes
- ...
- PROFIT!!!
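A minimal numpy sketch of the recipe above, with made-up match data; K, E, the sum-of-heroes pooling and the learning rate are illustrative assumptions, not values from the gist:

import numpy as np

K, E, LR = 100, 16, 0.05                      # pool size, embedding dim, learning rate (assumed)
rng = np.random.default_rng(0)
emb = 0.01 * rng.standard_normal((K, E))      # the (K, E) hero -> vector matrix
w, b = np.zeros(2 * E), 0.0                   # logistic-regression weights and bias

# hypothetical data: 5 hero ids per team and a binary "team1 won" label
team1 = rng.integers(0, K, (5000, 5))
team2 = rng.integers(0, K, (5000, 5))
won = rng.integers(0, 2, 5000).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for t1, t2, y in zip(team1, team2, won):
    v1, v2 = emb[t1].sum(axis=0), emb[t2].sum(axis=0)   # set-of-heroes -> vector
    x = np.concatenate([v1, v2])
    g = sigmoid(w @ x + b) - y                          # dLoss/dlogit for the log loss
    grad_w = g * x
    grad_v1, grad_v2 = g * w[:E], g * w[E:]             # gradients w.r.t. the team vectors
    w -= LR * grad_w
    b -= LR * g
    np.add.at(emb, t1, -LR * grad_v1)                   # backprop into team1's hero rows
    np.add.at(emb, t2, -LR * grad_v2)                   # backprop into team2's hero rows
# rows of emb are the learned hero embeddings: run stats / t-SNE on them as described above.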
@syhw
syhw / bear.pl
Last active November 17, 2015 00:49
#!/usr/bin/perl
use strict;
use warnings;
use Time::HiRes qw/usleep/;   # microsecond-resolution sleep
my $b; &load_bear;            # load the bear art into $b (the sub is defined later in the full script)

# `\e[17A` moves the cursor up 17 lines, presumably to redraw the frame in place.
sub rd {
    print "\n\e[17A";
}
@syhw
syhw / 3 points
Last active November 16, 2015 16:54
Why a message M carried by a person X differs from the same message M carried by a person Y:
0) Any message M that you can share in an article is incomplete! It necessarily relies on "common sense" and on shared background
knowledge (note that for persons X and Y, these implicit assumptions are probably different). Otherwise the message would contain all
the information needed to recreate both the situation and the reasoning. So what happens in the audience's head is I = f(M), with f
depending on the reader.
1) For a start, a sound proof (in the mathematical sense) that starts from bad premises is very hard to detect, because we spend more
time examining the reasoning than the starting facts. => So I am wary of good rhetoric in the mouths of people who have already lied
elsewhere.
"""
A deep neural network with or w/o dropout in one file.
"""
import numpy
import theano
import sys
import math
from theano import tensor as T
from theano import shared
@syhw
syhw / bandits.py
Created September 1, 2014 09:04
Bandits problem solved with naive algorithms, epsilon-greedy bandit, UCB1, and Bayesian bandits.
import numpy as np
from scipy.stats import bernoulli
N_EXPS = 200 # number of experiments to run (TODO: test 10k or 20k)
N_BAGS = 10 # number of bags (arms)
N_DRAWS = 100 # number of draws
SMOOTHER = 1.E-2 # shrinkage parameter
EPSILON_BANDIT = 0.1 # epsilon-greedy bandit epsilon
EPSILON_NUM = 1.E-9 # numerical epsilon
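For reference, the epsilon-greedy rule named in the description fits in a few lines; this is a standalone sketch with made-up arm probabilities, reusing the constant values from the preview above rather than the gist's actual loop:

import numpy as np

N_BAGS, N_DRAWS, EPSILON_BANDIT = 10, 100, 0.1   # same values as in the preview above
rng = np.random.default_rng(0)
true_p = rng.uniform(0.1, 0.9, N_BAGS)           # hidden success rate of each bag/arm (made up)
counts = np.zeros(N_BAGS)
values = np.zeros(N_BAGS)                        # running mean reward per arm

for _ in range(N_DRAWS):
    if rng.random() < EPSILON_BANDIT:            # explore: pick a random arm
        arm = int(rng.integers(N_BAGS))
    else:                                        # exploit: pick the best estimate so far
        arm = int(np.argmax(values))
    reward = float(rng.random() < true_p[arm])   # Bernoulli draw
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]

print(values, counts)   # the estimates concentrate on the better arms over time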
@syhw
syhw / dnn.py
Last active January 24, 2024 19:38
A simple deep neural network with or w/o dropout in one file.
"""
A deep neural network with or w/o dropout in one file.
License: Do What The Fuck You Want to Public License http://www.wtfpl.net/
"""
import numpy, theano, sys, math
from theano import tensor as T
from theano import shared
from theano.tensor.shared_randomstreams import RandomStreams
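The dropout part of "with or w/o dropout" amounts to multiplying activations by a random binary mask at training time; a plain-numpy sketch (the inverted-dropout rescaling here is an assumption, the gist itself draws its masks with the Theano RandomStreams imported above):

import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p=0.5, train=True):
    # at train time zero each unit with probability p and rescale the survivors;
    # at test time return the activations unchanged
    if not train or p == 0.0:
        return activations
    mask = rng.random(activations.shape) >= p    # keep each unit with probability 1 - p
    return activations * mask / (1.0 - p)

h = np.ones((4, 8))         # a fake batch of hidden-layer activations
print(dropout(h, p=0.5))    # roughly half the entries zeroed, survivors scaled by 2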
@syhw
syhw / dnn_compare_optims.py
Created July 30, 2014 07:46
DNN in one file, with dropout(s) and SAG
"""
A deep neural network with or w/o dropout in one file.
"""
import numpy
import theano
import sys
import math
from theano import tensor as T
from theano import shared
@syhw
syhw / dnn_compare_optims.py
Created July 21, 2014 09:07
comparing SGD vs SAG vs Adadelta vs Adagrad
"""
A deep neural network with or w/o dropout in one file.
"""
import numpy
import theano
import sys
import math
from theano import tensor as T
from theano import shared
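The update rules being compared can each be written in a few lines; a numpy-only sketch on a toy quadratic loss (so grad = w), with assumed hyperparameters rather than the ones used in dnn_compare_optims.py, and with SAG omitted since it needs a stored gradient per training example:

import numpy as np

def sgd(w, g, lr=0.1):
    return w - lr * g

def adagrad(w, g, state, lr=0.1, eps=1e-8):
    state["sum_sq"] = state.get("sum_sq", 0.0) + g * g          # accumulate squared gradients
    return w - lr * g / (np.sqrt(state["sum_sq"]) + eps)

def adadelta(w, g, state, rho=0.95, eps=1e-6):
    state["acc_g"] = rho * state.get("acc_g", 0.0) + (1 - rho) * g * g
    step = -np.sqrt(state.get("acc_dx", 0.0) + eps) / np.sqrt(state["acc_g"] + eps) * g
    state["acc_dx"] = rho * state.get("acc_dx", 0.0) + (1 - rho) * step * step
    return w + step

w_sgd = w_ada = w_add = 1.0
s_ada, s_add = {}, {}
for _ in range(50):
    w_sgd = sgd(w_sgd, w_sgd)                 # gradient of 0.5*w^2 is w
    w_ada = adagrad(w_ada, w_ada, s_ada)
    w_add = adadelta(w_add, w_add, s_add)
print(w_sgd, w_ada, w_add)   # the weight each optimizer reaches after 50 steps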