
@udibr
udibr / gruln.py
Last active November 7, 2020 02:34
Keras GRU with Layer Normalization
import numpy as np
from keras.layers import GRU
from keras import initializations
from keras import backend as K
from collections import OrderedDict
class GRULN(GRU):
'''Gated Recurrent Unit with Layer Normalization
Current implementation only works with consume_less = 'gpu', which is already set.
# Arguments
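The gist above is truncated, but its core idea is applying layer normalization inside the GRU's gate computations. A minimal NumPy sketch of the standard layer-normalization step itself (normalize each sample across its feature dimension, then scale by a learned gain `gamma` and shift by a bias `beta`; the epsilon placement is a common convention, not necessarily the gist's exact formula):

```python
import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # Normalize each row across its feature dimension,
    # then apply the learned scale (gamma) and shift (beta).
    mu = x.mean(axis=-1, keepdims=True)
    sigma = x.std(axis=-1, keepdims=True)
    return gamma * (x - mu) / (sigma + eps) + beta

x = np.array([[1.0, 2.0, 3.0]])
out = layer_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean())  # ~0: each row is re-centered before scaling
```

In the GRU case this normalization is applied to the pre-activations of the update, reset, and candidate gates at every time step, which is why the gist subclasses `GRU` rather than wrapping it.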
@colah
colah / translations.md
Last active December 28, 2019 06:31
A list of translations of posts from colah.github.io
@berak
berak / recurrent_network.py
Last active July 3, 2017 12:39
tensorflow
'''
A Recurrent Neural Network (LSTM) implementation example using the TensorFlow library.
This example is using the MNIST database of handwritten digits (http://yann.lecun.com/exdb/mnist/)
Long Short Term Memory paper: http://deeplearning.cs.cmu.edu/pdfs/Hochreiter97_lstm.pdf
Author: Aymeric Damien
Project: https://github.com/aymericdamien/TensorFlow-Examples/
'''
import cv2
import tensorflow as tf
import tensorflow.examples.tutorials.mnist.input_data as input_data
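The MNIST loader above yields flattened 784-pixel vectors, and the usual trick in these RNN examples is to read each 28x28 image as a sequence of 28 rows, 28 pixels each. A small NumPy sketch of that reshape (the batch of zeros is hypothetical placeholder data):

```python
import numpy as np

# Hypothetical flattened MNIST batch: 5 images, 784 pixels each.
batch = np.zeros((5, 784))

# The RNN consumes each image as 28 time steps of 28 features.
seq = batch.reshape(-1, 28, 28)
print(seq.shape)  # (5, 28, 28)
```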
@berak
berak / 5_convolutional_net.py
Last active July 3, 2017 12:39
tensorflow
import cv2
import numpy as np
import tensorflow as tf
import tensorflow.examples.tutorials.mnist.input_data as input_data
def init_weights(shape):
return tf.Variable(tf.random_normal(shape, stddev=0.01))
def model(X, w_h, w_o):
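The `model` function is cut off in this preview. A NumPy sketch of a comparable two-layer forward pass built from the same helpers, assuming a sigmoid hidden layer and a linear output layer (the 625-unit hidden size and 10-class output are illustrative assumptions, not the gist's code):

```python
import numpy as np

def init_weights(shape, stddev=0.01):
    # NumPy analogue of tf.Variable(tf.random_normal(shape, stddev=0.01))
    return np.random.normal(0.0, stddev, size=shape)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def model(X, w_h, w_o):
    # Sigmoid hidden layer followed by a linear output layer
    # (activation choice is an assumption for this sketch).
    h = sigmoid(X @ w_h)
    return h @ w_o

X = np.random.normal(size=(4, 784))   # toy batch of 4 flattened images
w_h = init_weights((784, 625))        # hidden size 625: illustrative only
w_o = init_weights((625, 10))         # 10 output classes
out = model(X, w_h, w_o)
print(out.shape)  # (4, 10)
```

The small `stddev=0.01` keeps initial pre-activations near zero, so the sigmoid starts in its linear regime rather than saturated.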

Git Cheat Sheet

Commands

Getting Started

git init

or