FragLegs (Shayne Miel) / attention_lstm.py
Created April 28, 2017, forked from mbollmann/attention_lstm.py

My attempt at creating an LSTM with attention in Keras.
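Before the gist itself, here is a minimal numpy sketch of the Bahdanau-style soft-attention step that the class docstring below cites. The weight names (W_a, U_a, v_a) and the function soft_attention are illustrative stand-ins, not part of this gist; the real layer learns these weights as Keras parameters rather than sampling them randomly.

import numpy as np

def soft_attention(h_prev, attended):
    """Additive (Bahdanau-style) attention sketch: score each attended
    timestep against the previous hidden state, softmax the scores, and
    return the weighted sum (the context vector). Weights are random here
    purely for illustration."""
    n_steps, dim = attended.shape
    rng = np.random.default_rng(0)
    W_a = rng.standard_normal((dim, h_prev.shape[0]))  # projects h_{t-1}
    U_a = rng.standard_normal((dim, dim))              # projects each x_i
    v_a = rng.standard_normal(dim)                     # scoring vector
    # e_i = v_a . tanh(W_a h_{t-1} + U_a x_i)  -- alignment scores
    e = np.tanh(h_prev @ W_a.T + attended @ U_a.T) @ v_a
    alpha = np.exp(e - e.max())
    alpha /= alpha.sum()             # softmax over timesteps
    return alpha @ attended          # context vector c_t, shape (dim,)

# e.g.: context = soft_attention(np.zeros(32), np.ones((10, 64)))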
from keras.layers import LSTM


class AttentionLSTM(LSTM):
    """LSTM with an attention mechanism.

    This is an LSTM that incorporates an attention mechanism into its
    hidden states. Currently, the context vector calculated from the
    attended vector is fed into the model's internal states, closely
    following the model of Xu et al. (2016, Sec. 3.1.2), with a soft
    attention model in the style of Bahdanau et al. (2014).

    The layer expects two inputs instead of the usual one:
        1. the usual layer input, and
        2. a 3D tensor to attend over.