iridiumblue / AttentionWithContext.py
Last active October 5, 2020 17:16 — forked from cbaziotis/AttentionWithContext.py
Keras Layer that implements an Attention mechanism, compatible with Eager Execution and TF 1.13. Supports masking. Follows Yang et al., "Hierarchical Attention Networks for Document Classification" [https://www.cs.cmu.edu/~diyiy/docs/naacl16.pdf].
# AttentionWithContext adapted for TensorFlow 1.13 with Eager Execution.
# Adapted from https://gist.github.com/cbaziotis/7ef97ccf71cbc14366835198c09809d2
# IMPORTANT - you can't use the regular Keras optimizers. You need one subclassed from
# tf.train.Optimizer. Not to worry, your favorite is probably there, for example:
# https://www.tensorflow.org/api_docs/python/tf/train/AdamOptimizer
# That's it, now you can use this layer.
# Tested with the functional API. Just plop it on top of an RNN, like so -
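# A minimal usage sketch, not part of the original gist: the sequence length, feature size,
# unit counts, and variable names below are illustrative assumptions, and it presumes the
# AttentionWithContext layer defined in this gist is already in scope.
import tensorflow as tf
from tensorflow.keras.layers import Input, Bidirectional, GRU, Dense
from tensorflow.keras.models import Model

# Eager Execution must be enabled before any ops are created (TF 1.x only).
tf.enable_eager_execution()

# (timesteps, features) per example - e.g. 100 steps of 300-d word vectors.
inputs = Input(shape=(100, 300))
# The RNN must return the full sequence so attention can weight every timestep.
rnn_out = Bidirectional(GRU(64, return_sequences=True))(inputs)
# AttentionWithContext collapses the timestep axis into a single weighted vector.
attn_out = AttentionWithContext()(rnn_out)
outputs = Dense(1, activation='sigmoid')(attn_out)

model = Model(inputs=inputs, outputs=outputs)
# Per the note above, use an optimizer subclassed from tf.train.Optimizer,
# not a plain keras.optimizers one.
model.compile(optimizer=tf.train.AdamOptimizer(learning_rate=1e-3),
              loss='binary_crossentropy',
              metrics=['accuracy'])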