syhw / dnn.py
Created July 13, 2014 20:55
Deep learning in one file.
"""
A deep neural network with or w/o dropout in one file.
"""
import numpy
import theano
import sys
from theano import tensor as T
from theano import shared
from theano.tensor.shared_randomstreams import RandomStreams
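As context for the file above, here is a minimal sketch (not the gist's actual layer code) of how a dropout mask is typically built with the RandomStreams object it imports; the dropout rate, the inverted-dropout rescaling, and the toy input are assumptions:
import numpy
import theano
from theano import tensor as T
from theano.tensor.shared_randomstreams import RandomStreams

srng = RandomStreams(seed=42)  # symbolic RNG used to sample dropout masks

def dropout(h, rate=0.5):
    # zero each unit with probability `rate` (assumed), rescaling so expectations match at test time
    mask = srng.binomial(n=1, p=1. - rate, size=h.shape)
    return h * T.cast(mask, theano.config.floatX) / (1. - rate)

x = T.matrix('x')
f = theano.function([x], dropout(x))
print(f(numpy.ones((2, 4), dtype=theano.config.floatX)))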
syhw / dropout_simple_models.py
Created July 17, 2014 14:55
Trying dropout with simple off-the-shelf scikit-learn models. Not really working.
from sklearn.datasets import fetch_20newsgroups, load_digits
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cross_validation import train_test_split
import numpy as np
from sklearn.naive_bayes import MultinomialNB, BernoulliNB
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn import metrics
newsgroups_train = fetch_20newsgroups(subset='train')
vectorizer = TfidfVectorizer(encoding='latin-1', max_features=10000)
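The preview stops at the vectorizer. As a hedged illustration of what "dropout with off-the-shelf scikit-learn models" could look like (not necessarily what this gist does), here is a sketch that emulates dropout by zeroing input features of the digits dataset before fitting a LogisticRegression; the dataset, classifier, dropout rate, and number of corrupted copies are all assumptions:
import numpy as np
from sklearn.datasets import load_digits
from sklearn.cross_validation import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn import metrics

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, random_state=0)

def input_dropout(X, rate=0.5, seed=0):
    # zero out each input feature independently with probability `rate` (assumed dropout emulation)
    rng = np.random.RandomState(seed)
    return X * rng.binomial(1, 1. - rate, size=X.shape)

# train on several independently corrupted copies of the training set, evaluate on clean features
X_noisy = np.vstack([input_dropout(X_train, seed=i) for i in range(5)])
y_noisy = np.concatenate([y_train] * 5)
clf = LogisticRegression().fit(X_noisy, y_noisy)
print(metrics.accuracy_score(y_test, clf.predict(X_test)))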
syhw / dnn_compare_optims.py
Created July 30, 2014 07:46
DNN in one file, with dropout(s) and SAG
"""
A deep neural network with or w/o dropout in one file.
"""
import numpy
import theano
import sys
import math
from theano import tensor as T
from theano import shared
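Since the description mentions SAG, here is a minimal numpy sketch of the stochastic average gradient update rule on a toy least-squares problem (the objective, step size, and iteration count are assumptions, not the gist's implementation): keep one remembered gradient per example and step along the average of all remembered gradients.
import numpy as np

rng = np.random.RandomState(0)
n, d = 100, 5
X = rng.randn(n, d)
w_true = rng.randn(d)
y = X.dot(w_true) + 0.1 * rng.randn(n)

w = np.zeros(d)
grad_memory = np.zeros((n, d))   # last seen gradient for each example
grad_sum = np.zeros(d)           # running sum of the memorized gradients
lr = 0.05                        # assumed step size

for step in range(5000):
    i = rng.randint(n)
    # gradient of the squared error of example i at the current w
    g_i = (X[i].dot(w) - y[i]) * X[i]
    grad_sum += g_i - grad_memory[i]   # keep the sum consistent with the memory
    grad_memory[i] = g_i
    w -= lr * grad_sum / n             # step along the average of stored gradients

print(np.allclose(w, w_true, atol=0.1))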
syhw / 3 points
Last active November 16, 2015 16:54
Why a message M carried by a person X is different from the same message M carried by a person Y:
0) Any message M that you can share in an article is incomplete! It necessarily rests on "common sense" and on shared background knowledge (note that for persons X and Y these implicit assumptions are probably different). Otherwise the message would contain all the information needed to recreate both the situation and the reasoning. So what ends up in the audience's head is I = f(M), with f depending on the reader.
1) A sound proof (in the mathematical sense) that starts from bad postulates is already very hard to detect, because we spend more time examining the reasoning than the starting premises. => So I am wary of good rhetoric in the mouths of people who have, moreover, already lied.
syhw / bear.pl
Last active November 17, 2015 00:49
#!/usr/bin/perl
use strict;
use warnings;
use Time::HiRes qw/usleep/;
my $b; &load_bear;
sub rd {
print "\n\e[17A";
}
syhw / gist:cf6644c7fb73b02a0131
Created December 16, 2015 18:01
<your moba here> (DOTA 2) heroes embedding
5v5 matches
number of heroes in the pool = K
dimension of the embedding = E
- encode a hero as a one-hot of heroes = 1-of-K
- learn a (K, E) matrix to go from hero -> vector (+ bias)
(notice that it can do set-of-heroes -> vector too)
- learn a logistic regression from the embeddings of both team1 and team2 to predict the winner, backpropagating through the embeddings (see the sketch after this list)
- do stats and t-SNE plots of embeddings of single heroes or combinations (teams) of heroes
- ...
- PROFIT!!!
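A minimal sketch of the recipe above, written in Theano to match the other gists; the pool size K, embedding dimension E, learning rate, and synthetic match data are assumptions, and the original gist's actual code is not shown here:
import numpy as np
import theano
from theano import tensor as T

K, E = 50, 8                     # assumed hero-pool size and embedding dimension
rng = np.random.RandomState(0)
floatX = theano.config.floatX

# parameters: embedding matrix (K, E), logistic-regression weights and bias
emb = theano.shared(0.01 * rng.randn(K, E).astype(floatX), name='emb')
w = theano.shared(np.zeros(2 * E, dtype=floatX), name='w')
b = theano.shared(np.asarray(0., dtype=floatX), name='b')

team1 = T.matrix('team1')        # (batch, K) multi-hot encoding of team 1's heroes
team2 = T.matrix('team2')        # (batch, K) multi-hot encoding of team 2's heroes
y = T.vector('y')                # 1 if team 1 won, else 0

# set-of-heroes -> vector: a multi-hot dotted with the embedding matrix sums the hero vectors
h = T.concatenate([team1.dot(emb), team2.dot(emb)], axis=1)
p_win = T.nnet.sigmoid(h.dot(w) + b)
loss = T.nnet.binary_crossentropy(p_win, y).mean()

params = [emb, w, b]
grads = T.grad(loss, params)     # backprop reaches the embedding through the dot products
updates = [(p, p - 0.1 * g) for p, g in zip(params, grads)]
train = theano.function([team1, team2, y], loss, updates=updates)

# synthetic 5v5 matches, just to show the call signature
def random_team(n):
    m = np.zeros((n, K), dtype=floatX)
    for row in m:
        row[rng.choice(K, 5, replace=False)] = 1.
    return m

for epoch in range(10):
    print(train(random_team(256), random_team(256),
                rng.randint(0, 2, 256).astype(floatX)))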
syhw / gist:5909833
Created July 2, 2013 14:36
Gaussian processes for threshold finding / response-curve fitting, to eliminate the staircase procedure in psychophysics experiments.
from sklearn.gaussian_process import GaussianProcess
import numpy as np
import copy
from matplotlib import pyplot as pl
np.random.seed(1)
def f(x, alpha=5., beta=10.):
"""The function to predict: Weibull centered on 5, ranging from 1 to 2."""
#return x * np.sin(x)
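The preview is cut off above. Purely as a hedged illustration of the idea in the description (fitting the response curve with a GP and reading the threshold off the posterior mean, instead of running a staircase), here is a sketch using the same old sklearn GaussianProcess API; the exact Weibull-shaped response, the hyperparameters, and the mid-point threshold criterion are assumptions:
from sklearn.gaussian_process import GaussianProcess
import numpy as np

np.random.seed(1)

def response(x, alpha=5., beta=10.):
    # assumed Weibull-CDF-shaped curve between 1 and 2 (the exact form in the gist is a guess)
    return 1. + (1. - np.exp(-(x / alpha) ** beta))

# noisy "subject responses" at a handful of stimulus intensities
X = np.atleast_2d(np.linspace(1., 9., 12)).T
y = response(X.ravel()) + 0.05 * np.random.randn(X.shape[0])

gp = GaussianProcess(theta0=1e-1, thetaL=1e-3, thetaU=1., nugget=1e-3)
gp.fit(X, y)

# predict the full curve and read off the x where it crosses the mid-point (the threshold)
x_grid = np.atleast_2d(np.linspace(1., 9., 200)).T
y_pred, mse = gp.predict(x_grid, eval_MSE=True)
threshold = x_grid[np.argmin(np.abs(y_pred - 1.5))][0]
print(threshold)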
syhw / RBM_checkers.py
Last active December 22, 2015 07:09
Encoding a checkerboard with an RBM
import numpy as np
import matplotlib.pyplot as plt
from sklearn.neural_network import BernoulliRBM
from sklearn import linear_model, metrics
from sklearn.pipeline import Pipeline
X = np.array([[0,1,0,1,0,1,0,1,
1,0,1,0,1,0,1,0,
0,1,0,1,0,1,0,1,
1,0,1,0,1,0,1,0,
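The array above is truncated in the preview. A hedged sketch of the same idea, generating the 8x8 checkerboard programmatically and fitting a BernoulliRBM; the number of hidden components, learning rate, noise level, and training-set construction are assumptions, not the gist's code:
import numpy as np
from sklearn.neural_network import BernoulliRBM

# 8x8 checkerboard flattened to a 64-dimensional binary vector
board = np.indices((8, 8)).sum(axis=0) % 2
X = np.tile(board.ravel(), (200, 1)).astype(float)
# flip a few bits so the RBM sees noisy versions of the pattern
rng = np.random.RandomState(0)
noise = rng.binomial(1, 0.05, size=X.shape)
X = np.abs(X - noise)

rbm = BernoulliRBM(n_components=16, learning_rate=0.05, n_iter=50, random_state=0)
rbm.fit(X)
# one Gibbs step should map a noisy board back towards the checkerboard
print(np.round(rbm.gibbs(X[:1]).astype(float)).reshape(8, 8))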
syhw / dnn_compare_optims.py
Created July 21, 2014 09:07
comparing SGD vs SAG vs Adadelta vs Adagrad
"""
A deep neural network with or w/o dropout in one file.
"""
import numpy
import theano
import sys
import math
from theano import tensor as T
from theano import shared
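To make the comparison in the description concrete, here is a minimal sketch of an Adagrad-style update expressed as Theano updates on a toy least-squares objective; the learning rate, epsilon, and objective are assumptions, not the gist's implementation (Adagrad divides each step by the square root of the accumulated squared gradients):
import numpy as np
import theano
from theano import tensor as T

floatX = theano.config.floatX
rng = np.random.RandomState(0)

# toy least-squares objective, just to have gradients to accumulate
X_data = rng.randn(100, 5).astype(floatX)
y_data = X_data.dot(rng.randn(5)).astype(floatX)

w = theano.shared(np.zeros(5, dtype=floatX), name='w')
x = T.matrix('x')
y = T.vector('y')
loss = T.mean((x.dot(w) - y) ** 2)

g = T.grad(loss, w)
acc = theano.shared(np.zeros(5, dtype=floatX), name='acc')  # running sum of squared gradients
new_acc = acc + g ** 2
# Adagrad: per-parameter step size shrinks with the accumulated squared gradient
updates = [(acc, new_acc),
           (w, w - 0.5 * g / T.sqrt(new_acc + 1e-6))]
train = theano.function([x, y], loss, updates=updates)

for epoch in range(100):
    train(X_data, y_data)
print(train(X_data, y_data))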