wiseodd / natural_grad.py
Created March 13, 2018 19:36
Natural Gradient Descent for Logistic Regression
import numpy as np
from sklearn.utils import shuffle
# Synthetic data: two Gaussian clusters for binary classification
# (class 0 centered at (-1, -1), class 1 centered at (+1, +1))
X0 = np.random.randn(100, 2) - 1
X1 = np.random.randn(100, 2) + 1
X = np.vstack([X0, X1])
t = np.vstack([np.zeros([100, 1]), np.ones([100, 1])])
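
The preview cuts off before the training loop. A minimal sketch of the natural-gradient update the gist builds toward, assuming a sigmoid model with a bias column folded into X and the empirical Fisher as the metric; the names (z, alpha, n_iter), the damping term, and the hyperparameters are illustrative, not from the gist:

X_b = np.hstack([X, np.ones([X.shape[0], 1])])  # fold a bias column into X
z = np.random.randn(X_b.shape[1], 1)            # weight vector, illustrative init

def sigmoid(a):
    return 1 / (1 + np.exp(-a))

alpha, n_iter = 0.1, 100                        # illustrative hyperparameters
for _ in range(n_iter):
    y = sigmoid(X_b @ z)                        # predicted probabilities, (N, 1)
    grads = X_b * (y - t)                       # per-example NLL gradients, (N, D)
    g = grads.mean(axis=0)[:, None]             # average gradient, (D, 1)
    F = grads.T @ grads / X_b.shape[0]          # empirical Fisher, (D, D)
    # Natural-gradient step: precondition the gradient by the inverse Fisher
    # (small damping added so the solve stays well-conditioned)
    z -= alpha * np.linalg.solve(F + 1e-8 * np.eye(F.shape[0]), g)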
gabrieleangeletti / rbm_after_refactor.py
Last active July 27, 2021 14:32
Restricted Boltzmann Machine implementation in TensorFlow, before and after code refactoring. Blog post: http://blackecho.github.io/blog/programming/2016/02/21/refactoring-rbm-tensor-flow-implementation.html
import tensorflow as tf
import numpy as np
import os
import zconfig  # project-local configuration module (see the linked blog repo)
import utils    # project-local helper utilities (see the linked blog repo)
class RBM(object):
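
The class body is truncated in this preview. As a framework-agnostic illustration of what such a class encapsulates, here is a minimal NumPy sketch of one contrastive-divergence (CD-1) update for a binary RBM; all names and sizes here are illustrative, not taken from the gist:

import numpy as np

def sigmoid(a):
    return 1 / (1 + np.exp(-a))

rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 784, 64, 0.01
W = rng.normal(scale=0.01, size=(n_visible, n_hidden))
b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)

def cd1_step(v0):
    # Positive phase: sample hidden units from the data
    h_prob = sigmoid(v0 @ W + b_h)
    h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
    # Negative phase: one Gibbs step back to visible, then to hidden
    v_prob = sigmoid(h_sample @ W.T + b_v)
    h_prob_neg = sigmoid(v_prob @ W + b_h)
    # Gradient estimate: positive minus negative associations, batch-averaged
    batch = v0.shape[0]
    return ((v0.T @ h_prob - v_prob.T @ h_prob_neg) / batch,
            (v0 - v_prob).mean(axis=0),
            (h_prob - h_prob_neg).mean(axis=0))

v_batch = (rng.random((32, n_visible)) < 0.5).astype(float)  # toy binary data
dW, db_v, db_h = cd1_step(v_batch)
W += lr * dW; b_v += lr * db_v; b_h += lr * db_h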
karpathy / min-char-rnn.py
Last active March 27, 2024 21:13
Minimal character-level language model with a Vanilla Recurrent Neural Network, in Python/numpy
"""
Minimal character-level Vanilla RNN model. Written by Andrej Karpathy (@karpathy)
BSD License
"""
import numpy as np
# data I/O
data = open('input.txt', 'r').read() # should be simple plain text file
chars = list(set(data))
data_size, vocab_size = len(data), len(chars)
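
The preview stops at the data I/O block. A minimal sketch of the vanilla RNN forward step the model is built around, using the conventional Wxh/Whh/Why parameter names; the hidden size and initialization scale are illustrative:

char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}

hidden_size = 100                                       # illustrative
Wxh = np.random.randn(hidden_size, vocab_size) * 0.01   # input -> hidden
Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden
Why = np.random.randn(vocab_size, hidden_size) * 0.01   # hidden -> output
bh, by = np.zeros((hidden_size, 1)), np.zeros((vocab_size, 1))

def step(ix, h):
    # One forward step: one-hot input, tanh recurrence, softmax over characters
    x = np.zeros((vocab_size, 1)); x[ix] = 1
    h = np.tanh(Wxh @ x + Whh @ h + bh)
    y = Why @ h + by
    p = np.exp(y) / np.sum(np.exp(y))
    return p, h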
neubig / plot-gp.py
Created November 19, 2014 02:25
A simple program to sample functions from a Gaussian process and plot them
#!/usr/bin/python
from math import exp
import numpy as np
import matplotlib.pyplot as plt
def rbf_kernel(x1, x2, variance = 1):
return exp(-1 * ((x1-x2) ** 2) / (2*variance))
def gram_matrix(xs):
    return [[rbf_kernel(x1, x2) for x2 in xs] for x1 in xs]
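
The preview ends here. A minimal sketch of the remaining steps, assuming a zero-mean GP prior: evaluate the Gram matrix on a grid, draw samples from the corresponding multivariate Gaussian, and plot them; the grid, jitter term, and sample count are illustrative:

xs = np.linspace(-5, 5, 100)
K = np.array(gram_matrix(xs)) + 1e-8 * np.eye(len(xs))  # jitter for numerical stability
for _ in range(5):  # five function draws from the GP prior
    ys = np.random.multivariate_normal(np.zeros(len(xs)), K)
    plt.plot(xs, ys)
plt.show()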