syhw / dnn.py
Last active January 24, 2024 19:38
A simple deep neural network, with or without dropout, in one file.
"""
A deep neural network with or w/o dropout in one file.
License: Do What The Fuck You Want to Public License http://www.wtfpl.net/
"""
import numpy, theano, sys, math
from theano import tensor as T
from theano import shared
from theano.tensor.shared_randomstreams import RandomStreams
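To make the idea concrete, here is a minimal sketch of the dropout mechanism this gist implements, using the same Theano RandomStreams import; the dropout helper and its drop probability p are illustrative, not the gist's actual layer code.

import numpy
import theano
from theano import tensor as T
from theano.tensor.shared_randomstreams import RandomStreams

srng = RandomStreams(seed=42)

def dropout(h, p=0.5):
    # zero each unit of h independently with probability p (training-time dropout)
    mask = srng.binomial(n=1, p=1. - p, size=h.shape)
    # cast the integer mask back to floatX so the graph stays in floats
    return h * T.cast(mask, theano.config.floatX)

x = T.matrix('x')
f = theano.function([x], dropout(x))
print(f(numpy.ones((2, 4), dtype=theano.config.floatX)))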
syhw / naive_Bayes_Gibbs.py
Created April 24, 2014 08:30
A very old implementation of mine of "Gibbs Sampling for the Uninitiated" (Philip Resnik, Eric Hardisty).
# -*- coding: utf-8 -*-
import os, re, random, math
from collections import Counter
""" Naive Bayes with Gibbs sampling, so it can deal with unlabeled data """
# as from "Gibbs Sampling for the Uninitiated" Philip Resnik, Eric Hardisty
def Dirichlet(v):
    """ takes a vector of counts v and returns a Multinomial ~ Dirichlet(v) """
    # draw Gamma(a, 1) for each count, then normalize onto the simplex
    y = [random.gammavariate(a, 1) for a in v]
    s = sum(y)
    return [yi / s for yi in y]
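For orientation, a hedged sketch of how such a draw is used in the Resnik & Hardisty scheme: class proportions are resampled from the current label counts, then each unlabeled document's label is resampled in turn. The names below (counts, pi, label) are illustrative and reuse the Dirichlet helper above.

import random

counts = [12, 8]                             # documents currently assigned to each class
pi = Dirichlet([c + 1.0 for c in counts])    # +1.0 acts as a flat Dirichlet prior
label = 0 if random.random() < pi[0] else 1  # resample one document's class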
syhw / gist:5128969
Created March 10, 2013 15:20
A collapsed Gibbs sampler for Dirichlet process Gaussian mixture models.
# -*- coding: utf-8 -*-
import itertools, random
import numpy as np
from scipy import linalg
import pylab as pl
import matplotlib as mpl
import math
epsilon = 10e-8
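To situate the snippet, a hedged sketch of the Chinese restaurant process step at the heart of a collapsed Gibbs sampler for a DP mixture: a point joins an existing cluster with weight proportional to its size times the predictive likelihood, or opens a new cluster with weight proportional to the concentration alpha. All names here are illustrative, not the gist's.

import random

def crp_assign(sizes, pred_liks, new_lik, alpha):
    # existing cluster k has weight n_k * p(x | cluster k);
    # a brand-new cluster (index len(sizes)) has weight alpha * p(x)
    weights = [n * l for n, l in zip(sizes, pred_liks)] + [alpha * new_lik]
    r = random.uniform(0, sum(weights))
    acc = 0.0
    for k, w in enumerate(weights):
        acc += w
        if r <= acc:
            return k
    return len(weights) - 1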
"""
A deep neural network with or w/o dropout in one file.
"""
import numpy
import theano
import sys
import math
from theano import tensor as T
from theano import shared
syhw / bandits.py
Created September 1, 2014 09:04
The multi-armed bandits problem solved with naive algorithms, an epsilon-greedy bandit, UCB1, and Bayesian bandits.
import numpy as np
from scipy.stats import bernoulli
N_EXPS = 200 # number of experiments to conduct TODO test 10k or 20k EXPS
N_BAGS = 10 # number of bags
N_DRAWS = 100 # number of draws
SMOOTHER = 1.E-2 # shrinkage parameter
EPSILON_BANDIT = 0.1 # epsilon-greedy bandit epsilon
EPSILON_NUM = 1.E-9 # numerical epsilon
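As a reference for two of the strategies the description names, hedged sketches of epsilon-greedy and UCB1 arm selection; the bookkeeping (pulls, rewards, t) is illustrative rather than the gist's own.

import math
import random

def epsilon_greedy(pulls, rewards, epsilon=0.1):
    # with probability epsilon explore a uniformly random arm,
    # otherwise exploit the arm with the best empirical mean
    if random.random() < epsilon:
        return random.randrange(len(pulls))
    means = [r / p if p else 0.0 for r, p in zip(rewards, pulls)]
    return max(range(len(means)), key=means.__getitem__)

def ucb1(pulls, rewards, t):
    # UCB1 (Auer et al., 2002): mean + sqrt(2 ln t / n_k); play untried arms first
    if 0 in pulls:
        return pulls.index(0)
    scores = [r / p + math.sqrt(2 * math.log(t) / p)
              for r, p in zip(rewards, pulls)]
    return max(range(len(scores)), key=scores.__getitem__)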
syhw / itc
Created September 3, 2019 04:40
#!/bin/bash
# tmux requires unrecognized OSC sequences to be wrapped with DCS tmux;
# <sequence> ST, and for all ESCs in <sequence> to be replaced with ESC ESC. It
# only accepts ESC backslash for ST.
function print_osc() {
  # wrap only when running under tmux/screen, per the comment above;
  # checking $TERM alone would wrap in every terminal
  if [[ $TERM == screen* ]] ; then
    printf "\033Ptmux;\033\033]"
  else
    printf "\033]"
  fi
}
syhw / dnn_compare_optims.py
Created July 16, 2014 19:21
To compare plain SGD vs. SAG vs. Adagrad vs. Adadelta.
"""
A deep neural network with or w/o dropout in one file.
"""
import numpy
import theano
import sys
import math
from theano import tensor as T
from theano import shared
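For reference, a hedged numpy sketch of the Adagrad rule among those being compared, outside the gist's Theano graph; the variable names are illustrative.

import numpy as np

def adagrad_step(param, grad, cache, lr=1e-2, eps=1e-6):
    # accumulate squared gradients, then scale each coordinate's step by the
    # root of that history, shrinking steps for frequently-updated weights
    cache += grad ** 2
    param -= lr * grad / (np.sqrt(cache) + eps)
    return param, cache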
syhw / (FL)OPS
Created January 12, 2019 16:33
_theoretical_ FLOPS (counting FMA = 2 OPS)
category | model | FLOPS (FP32) | HPOPS (half-precision) | AVX / cores / CUDA cores...
CPU | i9-8950HK (MacBookPro) | 278 GFLOPS | 556 GOPS? | AVX2, 6 cores
CPU | E5-2698 v4 | 704 GFLOPS | 1.4 TOPS? | AVX2, 20 cores
CPU | Xeon Plat. 8160 (~GCP/AWS) | 3.2 TFLOPS | 6.4 TOPS? | AVX512 x2, 24 cores (2.1*24*512/32*2*2)
CPU | Threadripper 2990WX | 1.5 TFLOPS | 3 TOPS? | AVX2, 32 cores (3*32*256/32*2)
GPU | K80 | 8.7 TFLOPS | | 2496 cuda cores x2 (=4992)
GPU | 1080 Ti | 10.6 TFLOPS | | 3584:224:88
GPU | AMD RX Vega 64 | 11.5 TFLOPS | 23 TOPS | 4096:256:64
GPU | M60 | 9.6 TFLOPS | | 2048:128:64 x2 (i.e. x2 GPU dies)
GPU | P100 (NVLink) | 10.6 TFLOPS | 21 TOPS | 3584:224:88?
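The parenthesized expressions spell out the estimate behind the CPU rows: clock (GHz) x cores x SIMD lanes (vector bits / 32-bit lanes) x 2 OPS per FMA, x2 again when a core has two FMA ports. A small sketch of that arithmetic, reproducing the Xeon Platinum 8160 row:

def peak_gflops(clock_ghz, cores, simd_bits, fma_ports=1, dtype_bits=32):
    # theoretical peak = clock * cores * lanes per core * 2 OPS per FMA * ports
    lanes = simd_bits / dtype_bits
    return clock_ghz * cores * lanes * 2 * fma_ports

print(peak_gflops(2.1, 24, 512, fma_ports=2))  # 3225.6 GFLOPS ~= 3.2 TFLOPS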
syhw / dnn_compare_optims.py
Created July 21, 2014 09:07
Comparing SGD vs. SAG vs. Adadelta vs. Adagrad.
"""
A deep neural network with or w/o dropout in one file.
"""
import numpy
import theano
import sys
import math
from theano import tensor as T
from theano import shared
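Since this gist covers the same comparison, a hedged numpy sketch of the remaining rule, Adadelta (Zeiler, 2012), which replaces the global learning rate with a ratio of running RMS terms; names are illustrative.

import numpy as np

def adadelta_step(param, grad, acc_g2, acc_dx2, rho=0.95, eps=1e-6):
    # keep decaying averages of squared gradients and squared updates;
    # the step is RMS(past updates) / RMS(past gradients) times the gradient
    acc_g2 = rho * acc_g2 + (1 - rho) * grad ** 2
    dx = -np.sqrt(acc_dx2 + eps) / np.sqrt(acc_g2 + eps) * grad
    acc_dx2 = rho * acc_dx2 + (1 - rho) * dx ** 2
    return param + dx, acc_g2, acc_dx2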
syhw / RBM_checkers.py
Last active December 22, 2015 07:09
Encoding a checkerboard with an RBM
import numpy as np
import matplotlib.pyplot as plt
from sklearn.neural_network import BernoulliRBM
from sklearn import linear_model, metrics
from sklearn.pipeline import Pipeline
X = np.array([[0,1,0,1,0,1,0,1,
               1,0,1,0,1,0,1,0,
               0,1,0,1,0,1,0,1,
               1,0,1,0,1,0,1,0,
               0,1,0,1,0,1,0,1,
               1,0,1,0,1,0,1,0,
               0,1,0,1,0,1,0,1,
               1,0,1,0,1,0,1,0]])  # one 8x8 checkerboard flattened to 64 binary pixels
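A hedged sketch of how such data is typically fed to scikit-learn's BernoulliRBM, independent of the gist's truncated pipeline: build the two checkerboard phases, fit a tiny RBM, and look at the hidden activations. The hyperparameters are illustrative.

import numpy as np
from sklearn.neural_network import BernoulliRBM

# the two 8x8 checkerboard phases as 64-dimensional binary vectors
board = (np.indices((8, 8)).sum(axis=0) % 2).ravel()
data = np.array([board, 1 - board] * 10)

rbm = BernoulliRBM(n_components=2, learning_rate=0.05, n_iter=200, random_state=0)
rbm.fit(data)
print(rbm.transform(data[:2]))  # hidden-unit activations for each phase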