dterg / Attention.py
Created Jun 4, 2018 — forked from cbaziotis/Attention.py
Keras Layer that implements an Attention mechanism for temporal data. Supports Masking. Follows the work of Raffel et al. [https://arxiv.org/abs/1512.08756]
from keras import backend as K, initializers, regularizers, constraints
from keras.engine.topology import Layer

def dot_product(x, kernel):
    """
    Wrapper for the dot product operation, compatible with both the
    Theano and TensorFlow backends.
    Args:
        x: input tensor
        kernel: weight vector
    """
    if K.backend() == 'tensorflow':
        return K.squeeze(K.dot(x, K.expand_dims(kernel)), axis=-1)
    return K.dot(x, kernel)
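The preview above cuts off after the helper; the full layer implements the feed-forward attention of Raffel et al., which scores each timestep with a learned vector, softmax-normalizes the scores, and returns the weighted sum of the hidden states. A minimal backend-free sketch of that computation (the function name, weights, and toy values are illustrative, not taken from the gist):

```python
import math

def attention_pool(states, weights, bias=0.0):
    """Feed-forward attention a la Raffel et al.: score each timestep,
    softmax the scores, return the attention-weighted sum of states."""
    # e_t = tanh(h_t . w + b) -- unnormalized score per timestep
    scores = [math.tanh(sum(h * w for h, w in zip(state, weights)) + bias)
              for state in states]
    # a_t = softmax(e_t) -- normalized attention weights
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    alphas = [e / total for e in exps]
    # context = sum_t a_t * h_t
    dim = len(states[0])
    context = [sum(a * state[i] for a, state in zip(alphas, states))
               for i in range(dim)]
    return context, alphas

# toy sequence: 3 timesteps, hidden dim 2
states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
context, alphas = attention_pool(states, weights=[0.5, 0.5])
```

The real Keras layer does the same thing with `dot_product` and backend ops so it stays differentiable and supports masking; here the softmax is computed with the usual max-subtraction for numerical stability.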
dterg / gensim2projector_tf.py
Created Aug 21, 2017 — forked from lampts/gensim2projector_tf.py
How to convert/port a gensim word2vec model to the TensorFlow Embedding Projector (TensorBoard).
# requires tensorflow 0.12
# requires gensim 0.13.3+ for the new API model.wv.index2word (or just use model.index2word)
from gensim.models import Word2Vec
import tensorflow as tf
from tensorflow.contrib.tensorboard.plugins import projector
# load your trained gensim word2vec model
model = Word2Vec.load("YOUR-MODEL")
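The preview stops after loading the model, and the `tensorflow.contrib` projector API it imports was removed in TensorFlow 2.x. A backend-agnostic alternative is to dump the vectors and their labels as two TSV files, which the standalone Embedding Projector (and TensorBoard's projector tab) can load directly. A minimal sketch, assuming a word-to-vector mapping; `export_for_projector` and the toy dictionary are illustrative, not part of the gist (a real model would supply `model.wv[word]` for each word in `model.wv.index2word`):

```python
import csv

def export_for_projector(vectors, vec_path="vectors.tsv", meta_path="metadata.tsv"):
    """Write vectors and labels as the two TSV files the Embedding
    Projector loads: one tab-separated vector per line, plus a
    metadata file with the matching label on each line."""
    with open(vec_path, "w", newline="") as vf, open(meta_path, "w", newline="") as mf:
        vec_writer = csv.writer(vf, delimiter="\t")
        meta_writer = csv.writer(mf, delimiter="\t")
        for word, vec in vectors.items():
            vec_writer.writerow(vec)    # vector components, tab-separated
            meta_writer.writerow([word])  # label for the same row
    return vec_path, meta_path

# toy stand-in for model.wv: word -> vector
toy = {"king": [0.1, 0.2], "queen": [0.1, 0.3]}
export_for_projector(toy)
```

Row order must match between the two files, which the single loop guarantees; the original gist instead wrote a TF checkpoint plus a `projector_config.pbtxt` via the contrib API.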