Jayaram Prabhu Durairaj bicepjai

bicepjai / AttentionWithContext.py
Last active October 24, 2017 01:25 — forked from rmdort/AttentionWithContext.py
Keras 2.0 Layer that implements an attention mechanism with a context/query vector for temporal data. Supports masking. Follows Yang et al., "Hierarchical Attention Networks for Document Classification" [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf].
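Before the Keras layer itself, here is a minimal sketch (not the gist's code) of the attention-with-context computation from Yang et al. in plain NumPy: project the hidden states, score them against a learned context vector, softmax the scores, and return the weighted sum. The function name and shapes are illustrative assumptions.

```python
import numpy as np

def attention_with_context(h, W, b, u):
    """Illustrative sketch of Yang et al.'s attention.
    h: (timesteps, dim) hidden states; W: (dim, dim) projection;
    b: (dim,) bias; u: (dim,) learned context vector.
    Returns a (dim,) attended summary of the sequence."""
    uit = np.tanh(h @ W + b)             # project each hidden state
    scores = uit @ u                     # similarity to the context vector
    a = np.exp(scores - scores.max())    # numerically stable softmax
    a = a / a.sum()                      # attention weights over timesteps
    return (h * a[:, None]).sum(axis=0)  # weighted sum of hidden states

# Example: 4 timesteps of 3-dim hidden states
rng = np.random.default_rng(0)
h = rng.standard_normal((4, 3))
out = attention_with_context(h, np.eye(3), np.zeros(3), np.ones(3))
print(out.shape)  # (3,)
```

The Keras layer below implements the same idea with trainable `W`, `b`, and `u`, plus masking support.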
# Note: the original mixed `tensorflow.contrib.keras` with standalone `keras`
# imports, which yields two separate Keras module instances; use one API.
import tensorflow as tf  # backend only; the layer uses the standalone Keras API
from keras.engine import Layer, InputSpec
from keras import regularizers, initializers, constraints
from keras import backend as K

class AttentionWithContext(Layer):