dterg / Attention.py
Created June 4, 2018 16:43 — forked from cbaziotis/Attention.py
Keras Layer that implements an Attention mechanism for temporal data. Supports Masking. Follows the work of Raffel et al. [https://arxiv.org/abs/1512.08756]
from keras import backend as K, initializers, regularizers, constraints
from keras.engine.topology import Layer

def dot_product(x, kernel):
    """
    Wrapper for the dot product operation, to stay compatible with both
    the Theano and TensorFlow backends.
    Args:
        x: input tensor
        kernel: attention weight vector
    """
    if K.backend() == 'tensorflow':
        return K.squeeze(K.dot(x, K.expand_dims(kernel)), axis=-1)
    return K.dot(x, kernel)
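The snippet above cuts off after the dot_product helper. For orientation, a minimal sketch of the feed-forward attention it supports (per Raffel et al.) could look like the following, reusing the imports and dot_product above; the class name SimpleAttention and the weight name att_kernel are illustrative, and the actual Attention.py additionally handles masking, a bias term, and regularizers/constraints.

class SimpleAttention(Layer):
    """Sketch of feed-forward attention: score each timestep, softmax, weighted sum."""

    def build(self, input_shape):
        # one scoring weight per feature dimension
        self.kernel = self.add_weight(name='att_kernel',
                                      shape=(input_shape[-1],),
                                      initializer='glorot_uniform',
                                      trainable=True)
        super(SimpleAttention, self).build(input_shape)

    def call(self, x):
        # x: (batch, timesteps, features)
        scores = K.tanh(dot_product(x, self.kernel))   # (batch, timesteps)
        weights = K.softmax(scores)                    # attention distribution over timesteps
        weighted = x * K.expand_dims(weights)          # weight each timestep's features
        return K.sum(weighted, axis=1)                 # (batch, features)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], input_shape[-1])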
dterg / gensim2projector_tf.py
Created August 21, 2017 23:32 — forked from lampts/gensim2projector_tf.py
How to convert/port a gensim word2vec model to the TensorFlow Embedding Projector (TensorBoard).
# requires tensorflow 0.12
# requires gensim 0.13.3+ for the new api model.wv.index2word (or just use model.index2word)
from gensim.models import Word2Vec
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector

# load your trained gensim word2vec model
model = Word2Vec.load("YOUR-MODEL")
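The snippet stops after loading the model. A rough sketch of the remaining conversion steps, continuing from the code above and assuming the TF 0.12-era contrib projector API named in the comments; the log directory, file names, and variable names are illustrative rather than those of the original gist.

import os
import numpy as np

log_dir = 'projector_logs'                      # directory TensorBoard will read from
if not os.path.exists(log_dir):
    os.makedirs(log_dir)

words = model.wv.index2word
embedding = np.zeros((len(words), model.vector_size))

# write one token per line; the projector uses this file to label the points
with open(os.path.join(log_dir, 'metadata.tsv'), 'w') as f:
    for i, word in enumerate(words):
        embedding[i] = model[word]
        f.write(word + '\n')

# store the embedding matrix in a checkpointed variable
sess = tf.InteractiveSession()
embedding_var = tf.Variable(embedding, trainable=False, name='w2v_embedding')
tf.global_variables_initializer().run()

saver = tf.train.Saver()
saver.save(sess, os.path.join(log_dir, 'w2v_model.ckpt'))

# point the projector plugin at the variable and its metadata file
config = projector.ProjectorConfig()
emb = config.embeddings.add()
emb.tensor_name = embedding_var.name
emb.metadata_path = 'metadata.tsv'              # may need an absolute path on some TF versions
projector.visualize_embeddings(tf.summary.FileWriter(log_dir), config)

Then run TensorBoard with --logdir pointing at the same directory and open the Projector tab.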