@skeeet
skeeet / residual_lstm_keras.py
Created May 5, 2017 05:46 — forked from bzamecnik/model_summary.txt
Residual LSTM in Keras
from keras.layers import LSTM, merge
def make_residual_lstm_layers(input, rnn_width, rnn_depth, rnn_dropout):
    """
    The intermediate LSTM layers return sequences, while the last returns a single element.
    The input is also a sequence. To sum the input and output of an LSTM their shapes must
    match, so the residual connection is applied to all layers but the last.
    """
    x = input
    for i in range(rnn_depth):
        return_sequences = i < rnn_depth - 1
        x_rnn = LSTM(rnn_width, dropout_W=rnn_dropout, dropout_U=rnn_dropout, return_sequences=return_sequences)(x)
        if return_sequences:
            # Intermediate layers: residual sum over the sequence
            # (assumes the incoming feature width already equals rnn_width).
            x = merge([x, x_rnn], mode='sum')
        else:
            # Last layer returns a single vector, so no residual sum with the sequence input.
            x = x_rnn
    return x
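A hypothetical usage sketch (the shapes, widths and the classification head below are illustrative assumptions, not part of the forked gist); the input feature width matches rnn_width so the residual sum is valid:

# Hypothetical usage of make_residual_lstm_layers; shapes are illustrative.
from keras.layers import Input, Dense
from keras.models import Model

inp = Input(shape=(32, 16))                      # (timesteps, features)
h = make_residual_lstm_layers(inp, rnn_width=16, rnn_depth=3, rnn_dropout=0.2)
out = Dense(1, activation='sigmoid')(h)          # e.g. a binary classification head
model = Model(input=inp, output=out)             # Keras 1.x Model signature
model.compile(optimizer='adam', loss='binary_crossentropy')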
@skeeet
skeeet / rnn_viz_keras.py
Created May 4, 2017 05:56 — forked from tokestermw/rnn_viz_keras.py
Recurrent Neural Network (RNN) visualizations using Keras.
from __future__ import print_function
from keras import backend as K
from keras.engine import Input, Model, InputSpec
from keras.layers import Dense, Activation, Dropout, Lambda
from keras.layers import Embedding, LSTM
from keras.optimizers import Adam
from keras.preprocessing import sequence
from keras.utils.data_utils import get_file
from keras.datasets import imdb
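The preview stops at the imports; below is a minimal sketch of the visualization idea under those imports (layer sizes, maxlen, vocab_size and x_batch are assumptions): expose the LSTM's per-timestep hidden states through a backend function so they can be plotted.

# Hedged sketch, reusing the Keras 1.x imports above; sizes are illustrative.
maxlen, vocab_size = 80, 20000

inputs = Input(shape=(maxlen,), dtype='int32')
emb = Embedding(vocab_size, 64)(inputs)
states = LSTM(32, return_sequences=True)(emb)                   # (samples, steps, 32)
last = Lambda(lambda s: s[:, -1, :], output_shape=(32,))(states)
prediction = Dense(1, activation='sigmoid')(last)
model = Model(input=inputs, output=prediction)
model.compile(optimizer=Adam(), loss='binary_crossentropy')

# Backend function mapping padded token ids to the hidden-state sequence,
# which is what gets colour-mapped in the visualization.
get_states = K.function([inputs], [states])
# x_batch: an integer array of shape (batch, maxlen), e.g. from sequence.pad_sequences
# state_seq = get_states([x_batch])[0]                          # (batch, maxlen, 32)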
@skeeet
skeeet / AttentionWithContext.py
Created May 4, 2017 05:49 — forked from nigeljyng/AttentionWithContext.py
Keras Layer that implements an Attention mechanism, with a context/query vector, for temporal data. Supports Masking. Follows the work of Yang et al. [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf] "Hierarchical Attention Networks for Document Classification"
class AttentionWithContext(Layer):
    """
    Attention operation, with a context/query vector, for temporal data.
    Supports Masking.
    Follows the work of Yang et al. [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf]
    "Hierarchical Attention Networks for Document Classification"
    by using a context vector to assist the attention
    # Input shape
        3D tensor with shape: `(samples, steps, features)`.
    # Output shape
        2D tensor with shape: `(samples, features)`.
    """
"""
A weighted version of categorical_crossentropy for keras (1.1.0). This lets you apply a weight to unbalanced classes.
@url: https://gist.github.com/wassname/ce364fddfc8a025bfab4348cf5de852d
@author: wassname
"""
from keras import backend as K

class weighted_categorical_crossentropy(object):
    """
    A weighted version of keras.objectives.categorical_crossentropy
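The preview cuts off before the implementation; below is a hedged sketch of how such a class-weighted loss can work in Keras backend ops (an illustrative equivalent, not necessarily wassname's exact code; the class name here is a stand-in):

# Hedged sketch of a class-weighted categorical crossentropy; illustrative, not the gist's code.
import numpy as np
from keras import backend as K

class WeightedCategoricalCrossentropy(object):    # illustrative name
    def __init__(self, weights):
        # weights: numpy array of shape (nb_classes,), one multiplier per class
        self.weights = K.variable(weights)

    def __call__(self, y_true, y_pred):
        # normalise predictions and clip to avoid log(0)
        y_pred /= K.sum(y_pred, axis=-1, keepdims=True)
        y_pred = K.clip(y_pred, K.epsilon(), 1 - K.epsilon())
        # class-weighted negative log-likelihood, averaged over the batch
        return -K.mean(K.sum(y_true * K.log(y_pred) * self.weights, axis=-1))

# Usage (hypothetical weights): class 1 counts five times as much as the others.
# model.compile(loss=WeightedCategoricalCrossentropy(np.array([1., 5., 1.])), optimizer='adam')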
@skeeet
skeeet / keras_attention_wrapper.py
Created April 12, 2017 04:39 — forked from wassname/keras_attention_wrapper.py
A keras attention layer that wraps RNN layers.
"""
A keras attention layer that wraps RNN layers.
Based on tensorflows [attention_decoder](https://github.com/tensorflow/tensorflow/blob/c8a45a8e236776bed1d14fd71f3b6755bd63cc58/tensorflow/python/ops/seq2seq.py#L506)
and [Grammar as a Foreign Language](https://arxiv.org/abs/1412.7449).
date: 20161101
author: wassname
url: https://gist.github.com/wassname/5292f95000e409e239b9dc973295327a
"""
@skeeet
skeeet / fail.py
Created April 3, 2017 10:27 — forked from guicho271828/fail.py
minimal failure cases, only on tensorflow backend
from keras.layers import Input, Dense
from keras.models import Model, Sequential
from keras.datasets import mnist
from keras.layers.normalization import BatchNormalization as BN

autoencoder1 = Sequential([
    Dense(128, activation='relu', input_shape=(784,)),
    BN(),
    Dense(784, activation='relu'),
])
# Example for my blog post at:
# http://danijar.com/introduction-to-recurrent-networks-in-tensorflow/
import functools
import sets
import tensorflow as tf
def lazy_property(function):
    attribute = '_' + function.__name__
    @property
    @functools.wraps(function)
    def wrapper(self):
        if not hasattr(self, attribute):
            setattr(self, attribute, function(self))
        return getattr(self, attribute)
    return wrapper
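A hypothetical usage sketch (the class and op below are illustrative, not the blog post's model), showing that a decorated property builds its TensorFlow ops only on first access:

# Hypothetical example: the op inside `doubled` is added to the graph a single time.
class TinyModel:
    def __init__(self, data):
        self.data = data
        self.doubled                      # first access builds the op

    @lazy_property
    def doubled(self):
        print('building the op')          # printed only once
        return tf.multiply(self.data, 2.0)

x = tf.placeholder(tf.float32, shape=[None])
model = TinyModel(x)
with tf.Session() as sess:
    print(sess.run(model.doubled, {x: [1.0, 2.0, 3.0]}))   # -> [2. 4. 6.]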
@skeeet
skeeet / ffmppeg-advanced-playbook-nvenc-and-libav-and-vaapi.md
Created March 24, 2017 11:05 — forked from Brainiarc7/ffmppeg-advanced-playbook-nvenc-and-libav-and-vaapi.md
FFmpeg's playbook: Advanced encoding options with hardware-based acceleration for both NVIDIA's NVENC and Intel's VAAPI-based hardware encoders, in both ffmpeg and libav.

FFmpeg and libav's playbook: Advanced encoding options with hardware-based acceleration using NVIDIA's NVENC and Intel's VAAPI-based encoders.

Hello guys,

Continuing from this guide to building ffmpeg and libav with NVENC and VAAPI enabled, this snippet will cover advanced options that you can use with ffmpeg and libav on both NVENC and VAAPI hardware-based encoders.

For ffmpeg:

@skeeet
skeeet / async_worker_pool.py
Created March 15, 2017 10:19 — forked from thehesiod/async_worker_pool.py
Asynchronous Worker Pool
import asyncio
from datetime import datetime, timezone
import os

def utc_now():
    # utcnow returns a naive datetime, so we have to set the timezone manually <sigh>
    return datetime.utcnow().replace(tzinfo=timezone.utc)

class Terminator:
    pass
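The preview ends before the pool itself; below is a minimal sketch of the pattern the filename and the Terminator sentinel suggest (a queue drained by a fixed number of worker tasks), reusing utc_now and Terminator from above — an assumption-laden illustration, not the gist's implementation:

# Hedged sketch of an asyncio worker pool: N workers drain a shared queue
# until each sees a Terminator sentinel. Not the gist's exact code.
async def worker(queue):
    while True:
        job = await queue.get()
        if isinstance(job, Terminator):
            queue.task_done()
            break
        await job                          # each job is an awaitable
        queue.task_done()

async def run_pool(jobs, num_workers=4):
    queue = asyncio.Queue()
    workers = [asyncio.ensure_future(worker(queue)) for _ in range(num_workers)]
    for job in jobs:
        await queue.put(job)
    for _ in range(num_workers):
        await queue.put(Terminator())      # one shutdown sentinel per worker
    await queue.join()
    await asyncio.gather(*workers)

async def say(msg):
    await asyncio.sleep(0.1)
    print(utc_now(), msg)

loop = asyncio.get_event_loop()
loop.run_until_complete(run_pool([say(i) for i in range(10)]))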
@skeeet
skeeet / setup.md
Created March 13, 2017 20:22 — forked from fortunto2/setup.md
Setup Amazon AWS EC2 g2.2xlarge instance with OpenCV 3.1, Cuda 7.5, ffmpeg, OpenFace