Ilya ilyaivensky

ilyaivensky / dropout_lstm.py
Last active September 3, 2018 07:32 — forked from rasmusbergpalm/dropout_lstm.py
Lasagne LSTM w. dropout
from lasagne import *
from lasagne.layers import *
from lasagne.random import get_rng
from lasagne.utils import *
import numpy as np
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams
class DropoutLSTMLayer(MergeLayer):
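The preview above cuts off at the class definition. As a rough companion, here is a minimal numpy sketch of inverted dropout, the masking scheme such a layer typically applies to its recurrent connections; the function name and the use of numpy in place of Theano's RandomStreams are illustrative assumptions, not the gist's code:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, deterministic=False):
    """Inverted dropout: zero each unit with probability p and
    rescale the survivors by 1 / (1 - p), so the expected
    activation is unchanged. At test time (deterministic=True)
    the input passes through untouched."""
    if deterministic or p == 0:
        return x
    mask = rng.binomial(1, 1 - p, size=x.shape)
    return x * mask / (1 - p)
```

In an LSTM layer the same idea is applied per time step, usually with a mask resampled (or, in some variants, fixed) across the sequence.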
ilyaivensky / simple_gan.py
Created March 14, 2017 21:12 — forked from Newmu/simple_gan.py
Simple Generative Adversarial Network Demo
import os
import numpy as np
from matplotlib import pyplot as plt
from time import time
from foxhound import activations
from foxhound import updates
from foxhound import inits
from foxhound.theano_utils import floatX, sharedX
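For orientation, the adversarial objective that a demo like this optimizes can be stated compactly. Below is a hedged numpy sketch of the standard discriminator loss and the non-saturating generator loss, computed from discriminator outputs in (0, 1); the function name and numpy formulation are my own, not foxhound's API:

```python
import numpy as np

def gan_losses(d_real, d_fake):
    """GAN losses from discriminator probabilities.

    d_real: D's outputs on real samples (wants these near 1).
    d_fake: D's outputs on generated samples (wants these near 0,
            while G wants them near 1)."""
    d_loss = -np.mean(np.log(d_real) + np.log(1.0 - d_fake))
    g_loss = -np.mean(np.log(d_fake))  # non-saturating generator loss
    return d_loss, g_loss
```

Training alternates gradient steps on the two losses: one (or more) discriminator updates, then a generator update.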
ilyaivensky / adam.py
Created February 23, 2017 23:23 — forked from skaae/adam.py
def adam(loss, all_params, learning_rate=0.001, b1=0.9, b2=0.999, e=1e-8,
         gamma=1-1e-8):
    """
    ADAM update rules
    Default values are taken from [Kingma2014]

    References:
    [Kingma2014] Kingma, Diederik, and Jimmy Ba.
    "Adam: A Method for Stochastic Optimization."
    arXiv preprint arXiv:1412.6980 (2014).
    """
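The preview shows only the signature and docstring. As a rough numpy restatement of the update rule those defaults parameterize (my own sketch, not the gist's Theano code; the gist's `gamma` decay of `b1` is omitted here for brevity):

```python
import numpy as np

def adam_step(param, grad, m, v, t, learning_rate=0.001,
              b1=0.9, b2=0.999, e=1e-8):
    """One Adam update for a single parameter array.
    t is the 1-based step count; m and v carry the running
    moment estimates between calls."""
    m = b1 * m + (1 - b1) * grad           # first-moment estimate
    v = b2 * v + (1 - b2) * grad ** 2      # second-moment estimate
    m_hat = m / (1 - b1 ** t)              # bias correction
    v_hat = v / (1 - b2 ** t)
    param = param - learning_rate * m_hat / (np.sqrt(v_hat) + e)
    return param, m, v
```

Because the effective step size is roughly `learning_rate` regardless of gradient scale, Adam makes steady progress even on poorly scaled objectives.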