Eder Santana (EderSantana)
EderSantana / mlp.py
Last active August 29, 2015 13:57
Mod for pylearn2/models/mlp.py: makes tied biases the default for ConvRectifiedLinear layers
"""
Multilayer Perceptron
"""
__authors__ = "Ian Goodfellow"
__copyright__ = "Copyright 2012-2013, Universite de Montreal"
__credits__ = ["Ian Goodfellow", "David Warde-Farley"]
__license__ = "3-clause BSD"
__maintainer__ = "Ian Goodfellow"
import math
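For context, "tied" biases mean one bias per output channel, shared across every spatial position of the feature map, while "untied" biases give each spatial location its own bias. A minimal numpy sketch of the difference (shapes here are illustrative, not taken from the gist):

import numpy as np

# Feature maps from a conv layer: (batch, channels, rows, cols)
acts = np.zeros((128, 64, 24, 24))

# Tied biases: one scalar per channel, broadcast over all positions
b_tied = np.zeros((64, 1, 1))
out_tied = acts + b_tied  # broadcasts to (128, 64, 24, 24)

# Untied biases: a separate bias at every spatial location
b_untied = np.zeros((64, 24, 24))
out_untied = acts + b_untied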
EderSantana / test_conv_relu.yaml
Created March 12, 2014 22:58
Test tied biases for ConvRectifiedLinear
!obj:pylearn2.train.Train {
    dataset: &train !obj:pylearn2.datasets.mnist.MNIST {
        which_set: 'train',
        one_hot: 1,
        start: 0,
        stop: 50000
    },
    model: !obj:pylearn2.models.mlp.MLP {
        batch_size: 128,
        layers: [
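The preview cuts off at the layers list. For reference, here is a plausible way to build a comparable model directly in Python with pylearn2's usual constructor arguments; the layer name and hyperparameters below are illustrative assumptions, not recovered from the gist:

from pylearn2.models.mlp import MLP, ConvRectifiedLinear, Softmax
from pylearn2.space import Conv2DSpace

model = MLP(
    batch_size=128,
    input_space=Conv2DSpace(shape=[28, 28], num_channels=1),
    layers=[
        ConvRectifiedLinear(
            layer_name='h0',
            output_channels=64,
            irange=.05,
            kernel_shape=[5, 5],
            pool_shape=[4, 4],
            pool_stride=[2, 2],
            tied_b=True,  # the behavior the mlp.py gist makes the default
        ),
        Softmax(layer_name='y', n_classes=10, irange=0.),
    ],
)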
EderSantana / test_conv_maxout.yaml
Created March 12, 2014 22:59
Test tied biases for ConvMaxoutC01B
!obj:pylearn2.train.Train {
    dataset: &train !obj:pylearn2.datasets.mnist.MNIST {
        which_set: 'train',
        one_hot: 1,
        axes: ['c', 0, 1, 'b'],
        start: 0,
        stop: 50000
    },
    model: !obj:pylearn2.models.mlp.MLP {
        batch_size: 128,
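The axes: ['c', 0, 1, 'b'] line asks the dataset for channel-first, batch-last tensors, the layout the maxout conv layers (built on cuda-convnet kernels) expect, rather than pylearn2's default ('b', 0, 1, 'c'). A quick numpy illustration of the reordering (shapes are illustrative):

import numpy as np

# Default pylearn2 topological layout: ('b', 0, 1, 'c')
batch_b01c = np.zeros((128, 28, 28, 1))

# Reorder to ('c', 0, 1, 'b') for cuda-convnet style layers
batch_c01b = batch_b01c.transpose(3, 1, 2, 0)
print(batch_c01b.shape)  # (1, 28, 28, 128)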
EderSantana / gist:9916675
Last active September 29, 2016 00:31
KLMS (Kernel Least Mean Squares) example
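The gist itself no longer renders, so as a reference point: KLMS keeps the LMS update rule but replaces inner products with kernel evaluations against a growing dictionary of past inputs. A minimal sketch of the algorithm; the Gaussian kernel, step size, and toy signal below are assumptions, not recovered from the gist:

import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # RBF kernel between two input vectors (an assumed kernel choice)
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def klms(X, d, step=0.5, sigma=1.0):
    """Kernel LMS: predict d[n] from X[n], adapting online."""
    centers, alphas, preds = [], [], []
    for x, target in zip(X, d):
        # Prediction: weighted sum of kernels against stored centers
        y = sum(a * gaussian_kernel(c, x, sigma)
                for c, a in zip(centers, alphas))
        err = target - y
        # LMS-style update: store the input with coefficient step * error
        centers.append(x)
        alphas.append(step * err)
        preds.append(y)
    return np.array(preds)

# Toy usage: learn a noisy nonlinear mapping online
rng = np.random.RandomState(0)
X = rng.uniform(-1, 1, size=(200, 1))
d = np.sin(3 * X[:, 0]) + 0.05 * rng.randn(200)
preds = klms(X, d)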
EderSantana / hw2_part2.ipynb
Last active March 20, 2016 03:49
hw2_part2
EderSantana / gist:6d969169c0d7eddb6d0c
Last active August 29, 2015 14:17
Conditioned Recurrent
from blocks.bricks.recurrent import BaseRecurrent

class ConditionedRecurrent(BaseRecurrent):
    def __init__(self, wrapped, **kwargs):
        super(ConditionedRecurrent, self).__init__(**kwargs)
        # Wrap an existing recurrent brick and register it as a child
        self.wrapped = wrapped
        self.children = [wrapped]

    def get_dim(self, name):
        if name == 'context':
            return self.wrapped.get_dim('inputs')
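The preview stops before the application method, but the pattern the class name suggests is common: feed a fixed context vector into the wrapped transition at every time step. A framework-free sketch of that idea in plain numpy (hypothetical shapes and weights, not the gist's Blocks implementation):

import numpy as np

def conditioned_step(h, x, context, Wh, Wx, Wc, b):
    # One recurrent step whose update also sees a fixed context vector
    return np.tanh(h @ Wh + x @ Wx + context @ Wc + b)

# Hypothetical dimensions: hidden 4, input 3, context 5
rng = np.random.RandomState(0)
Wh, Wx, Wc = rng.randn(4, 4), rng.randn(3, 4), rng.randn(5, 4)
b = np.zeros(4)

h = np.zeros(4)
context = rng.randn(5)          # stays fixed across time
for x in rng.randn(10, 3):      # a 10-step input sequence
    h = conditioned_step(h, x, context, Wh, Wx, Wc, b)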
EderSantana / test_batch_norm.py
Last active August 29, 2015 14:20
Blocks Batch Normalization Testing
# You are supposed to run this after the `Batch Normalization Tutorial`
import theano
import numpy as np
from blocks.filter import VariableFilter
from blocks.roles import OUTPUT

# Collect the output variables of each linear layer in the MLP
outputs = VariableFilter(
    bricks=mlp.linear_transformations, roles=[OUTPUT])(cg_bn.variables)

f = []
for o, g, b in zip(outputs, gammas, betas):
    f.append(
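As a reminder of what the gamma/beta pairs in that loop are for: batch normalization standardizes each layer output with batch statistics, then rescales and shifts it with learned parameters. A minimal numpy version of the transform (the epsilon value is an arbitrary choice):

import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Standardize over the batch axis, then rescale and shift
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

x = np.random.randn(128, 64)   # a batch of layer outputs
y = batch_norm(x, gamma=np.ones(64), beta=np.zeros(64))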
EderSantana / draw_config.py
Last active October 1, 2015 03:57
DRAW config
# We are training this DRAW network
# using a model similar to the one
# described in the paper http://arxiv.org/pdf/1502.04623.pdf
#
# Dataset: Binary-MNIST from mila-udem/fuel
from keras.initializations import normal
from seya.layers.draw import DRAW

def myinit(shape):
    return normal(shape, scale=.01)

# NOTE: I'm not sure if this is right
from keras.layers.recurrent import LSTM

class LSTMpeephole(LSTM):
    def __init__(self, **kwargs):
        super(LSTMpeephole, self).__init__(**kwargs)

    def build(self):
        super(LSTMpeephole, self).build()
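The preview ends just where build() would presumably add the peephole weights. For reference, peephole connections let the gates read the cell state directly, so each gate gets an extra elementwise term. A hypothetical numpy sketch of a single peephole gate (not the gist's Keras code; all names and sizes are assumptions):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def peephole_gate(x, h, c_prev, W, U, p, b):
    # A peephole gate also sees the previous cell state c_prev,
    # through an elementwise weight vector p
    return sigmoid(x @ W + h @ U + p * c_prev + b)

# Hypothetical sizes: input 3, hidden/cell 4
rng = np.random.RandomState(0)
x, h, c_prev = rng.randn(3), np.zeros(4), np.zeros(4)
i = peephole_gate(x, h, c_prev, rng.randn(3, 4), rng.randn(4, 4),
                  rng.randn(4), np.zeros(4))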