
@yukoba
yukoba / EMTest.scala
Created January 9, 2017 07:19
EM algorithm for a Gaussian mixture model
package jp.yukoba
import org.apache.commons.math3.distribution.NormalDistribution
import org.scalatest.FunSuite
import scala.util.Random
class EMTest extends FunSuite {
  /** Based on p. 312 of The Elements of Statistical Learning (Japanese edition, 統計的学習の基礎) */
  test("EM algorithm for a Gaussian mixture model") {
import numpy as np

# AdaGrad-style step sizes: accumulate the sum of squared gradients in g2.
epsilon = 1e-8
g2 = 0
for g in [1, 10, 100]:
    for t in range(10):
        g2 += g ** 2
        dx = -5e-4 / np.sqrt(g2 + epsilon) * g
        print(t, dx)
import numpy as np

# RMSProp-style step sizes: exponential moving average of squared gradients in g2.
epsilon = 1e-8
g2 = 0
for g in [1, 10, 100]:
    for t in range(10):
        g2 = 0.95 * g2 + 0.05 * g ** 2
        dx = -1e-4 / np.sqrt(g2 + epsilon) * g
        print(t, dx)
import numpy as np

# Adadelta-style step sizes: moving averages of squared gradients (g2) and squared updates (dx2).
epsilon = 1e-8
dx2 = 0
g2 = 0
for g in [1, 10, 100]:
    for t in range(10):
        g2 = 0.95 * g2 + 0.05 * g ** 2
        dx = -np.sqrt(dx2 + epsilon) / np.sqrt(g2 + epsilon) * g
        # Assumed continuation of the truncated snippet: the usual Adadelta accumulation.
        dx2 = 0.95 * dx2 + 0.05 * dx ** 2
        print(t, dx)
@yukoba
yukoba / nested_theano_scan_adagrad.py
Last active September 10, 2016 07:05
AdaGrad + stochastic gradient descent using nested theano.scan()
import numpy as np
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams
# AdaGrad + stochastic gradient descent using nested theano.scan()
train_x = np.random.rand(100)
train_y = train_x + np.random.rand(100) * 0.01
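Only the imports and training data survive in the preview. One way the nested structure might be organized (a sketch under my own assumptions, not the gist's code) is an outer scan over epochs whose step function runs an inner scan over the individual samples, fitting a slope w for y ≈ w * x with AdaGrad-scaled steps:

def sample_step(x_i, y_i, w, r):
    # One AdaGrad update from a single (x_i, y_i) pair.
    loss = (w * x_i - y_i) ** 2
    g = T.grad(loss, w)
    r = r + g ** 2
    return w - 0.1 / T.sqrt(r + 1e-8) * g, r

def epoch_step(w, r, xs, ys):
    # Inner scan: one pass over all samples; keep the final w and r of the pass.
    (ws, rs), updates = theano.scan(sample_step, sequences=[xs, ys], outputs_info=[w, r])
    return (ws[-1], rs[-1]), updates

w0 = T.dscalar("w0")
xs, ys = T.dvector("xs"), T.dvector("ys")
# Outer scan: 10 epochs over the data.
(w_hist, r_hist), updates = theano.scan(epoch_step,
                                        outputs_info=[w0, T.zeros_like(w0)],
                                        non_sequences=[xs, ys],
                                        n_steps=10)
train = theano.function([w0, xs, ys], w_hist[-1], updates=updates)
print(train(0.0, train_x, train_y))  # slope estimate; should end up near 1 since train_y ≈ train_x

The sequential pass over samples is a simplification; the original presumably uses the imported RandomStreams to pick samples at random.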
@yukoba
yukoba / theano_scan_adagrad_stochastic_gradient_descent.py
Last active September 10, 2016 06:12
AdaGrad + stochastic gradient descent using theano.scan()
import numpy as np
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams
# AdaGrad + stochastic gradient descent using theano.scan()
train_x = np.random.rand(100)
train_y = train_x + np.random.rand(100) * 0.01
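Again the preview stops after the data. A plausible single-level version (my own sketch, not the gist's code) keeps the training data in shared variables and lets each scan step draw one random index through RandomStreams:

srng = RandomStreams(seed=1234)
shared_x = theano.shared(train_x)
shared_y = theano.shared(train_y)

def step(w, r):
    # Pick one random training example per step (the stochastic part of SGD).
    i = T.cast(T.floor(srng.uniform(size=(), low=0, high=train_x.shape[0])), "int64")
    loss = (w * shared_x[i] - shared_y[i]) ** 2
    g = T.grad(loss, w)
    r = r + g ** 2  # AdaGrad accumulator
    return w - 0.1 / T.sqrt(r + 1e-8) * g, r

w0 = T.dscalar("w0")
(ws, rs), updates = theano.scan(step, outputs_info=[w0, T.zeros_like(w0)], n_steps=1000)
# The updates dictionary carries the random stream state and must be passed to the function.
train = theano.function([w0], ws[-1], updates=updates)
print(train(0.0))  # slope estimate for y ≈ w * x; should be near 1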
@yukoba
yukoba / theano_scan_adagrad.py
Last active September 9, 2016 04:33
AdaGrad using theano.scan()
import theano
import theano.tensor as T
# AdaGrad using theano.scan()
def fn(x, r, learning_rate):
    # One AdaGrad step on y = x**2 - x: accumulate squared gradients in r.
    y = x ** 2 - x
    g = T.grad(y, x)
    r += g ** 2
    return x - learning_rate / T.sqrt(r) * g, r
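The preview shows the AdaGrad step function but not the scan call. A possible driver (assumed wiring, not necessarily what the gist does below the fold):

init_x = T.dscalar("init_x")
lr = T.dscalar("lr")
# Iterate the AdaGrad step 100 times; outputs_info seeds x and the accumulator r.
(xs, rs), _ = theano.scan(fn,
                          outputs_info=[init_x, T.zeros_like(init_x)],
                          non_sequences=[lr],
                          n_steps=100)
minimize = theano.function([init_x, lr], xs[-1])
print(minimize(3.0, 0.5))  # iterates move toward x = 0.5, the minimizer of x**2 - x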
@yukoba
yukoba / theano_scan_steepest_descent_method.py
Last active September 9, 2016 04:39
Steepest descent method using theano.scan()
import theano
import theano.tensor as T
# Steepest descent method using theano.scan()
def fn(x, learning_rate):
    # One gradient-descent step on y = x**2 - x.
    y = x ** 2 - x
    return x - learning_rate * T.grad(y, x)
init_x = T.dscalar()
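The preview ends right at the symbolic input. The remaining wiring is presumably analogous to the AdaGrad gist above, with a single recurrent state; a sketch:

learning_rate = T.dscalar("learning_rate")
# 100 plain gradient-descent steps on y = x**2 - x.
xs, _ = theano.scan(fn, outputs_info=[init_x], non_sequences=[learning_rate], n_steps=100)
descend = theano.function([init_x, learning_rate], xs[-1])
print(descend(3.0, 0.1))  # converges toward x = 0.5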
@yukoba
yukoba / autograd_benchmark.py
Created June 19, 2016 01:44
AutoGrad benchmark of matmul
import timeit
import autograd.numpy as np
from autograd import grad
def fn(a, b):
    # Objective whose gradient autograd computes: the sum of all entries of a @ b.
    return np.sum(a @ b)
n = 1000
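The preview stops after n is set. The rest of the benchmark presumably builds the matrices and times the forward and gradient passes, roughly like this (assumed sizes and repetition count, not the gist's exact code):

a = np.random.rand(n, n)
b = np.random.rand(n, n)
grad_fn = grad(fn)  # gradient of fn with respect to its first argument, a

print(timeit.timeit(lambda: fn(a, b), number=10))       # forward pass only
print(timeit.timeit(lambda: grad_fn(a, b), number=10))  # forward plus backward pass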
@yukoba
yukoba / test_two_queues.py
Last active November 6, 2015 06:57
Fetching the minimum sum of priorities from multiple queues, as posted on Facebook, ver. 2
from bisect import bisect_left
from heapq import heapify, heappop, heappush
from itertools import islice
from sys import maxsize
# Sample data
pq0 = [(1, "a"), (2, "b"), (3, "f")]
pq1 = [(1, "c"), (2, "d"), (3, "g")]
heapify(pq0)
heapify(pq1)
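Only the imports and sample data fit in the preview. Reading the description as "pick one entry from each queue so that the sum of the priorities is smallest, enumerated in increasing order of that sum" (my interpretation, not confirmed by the gist), a heap-based best-first enumeration over index pairs would look roughly like this:

def pairs_by_priority_sum(q0, q1):
    # Yield (entry0, entry1) pairs ordered by the sum of their priorities,
    # given two lists sorted by priority.
    seen = {(0, 0)}
    heap = [(q0[0][0] + q1[0][0], 0, 0)]
    while heap:
        _, i, j = heappop(heap)
        yield q0[i], q1[j]
        for ni, nj in ((i + 1, j), (i, j + 1)):
            if ni < len(q0) and nj < len(q1) and (ni, nj) not in seen:
                seen.add((ni, nj))
                heappush(heap, (q0[ni][0] + q1[nj][0], ni, nj))

# The three cheapest combinations of the sample data above.
print(list(islice(pairs_by_priority_sum(sorted(pq0), sorted(pq1)), 3)))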