@InnerPeace-Wu
Last active October 4, 2017 03:25
Demo of implementing "attention" in TensorFlow
import tensorflow as tf


def tf_attention():
    # Attention weights, shape (2, 2).
    a = tf.constant([[2, 3], [4, 5]])
    # Values to attend over, shape (2, 2, 3).
    b = tf.constant([[[1, 1, 1], [2, 2, 2]], [[3, 3, 3], [4, 4, 4]]])
    ##
    # What we want is: [[[2,2,2],[6,6,6]],[[12,12,12],[20,20,20]]],
    # then sum it over dimension 1 to get: [[8 8 8], [32 32 32]].
    ##
    d = tf.expand_dims(a, 2)      # shape (2, 2, 1)
    e = tf.tile(d, [1, 1, 3])     # shape (2, 2, 3)
    # Two ways of element-wise multiplication:
    # c = tf.multiply(b, e)
    c = b * e
    f = tf.reduce_sum(c, axis=1)  # weighted sum over dimension 1
    with tf.Session() as sess:
        print(sess.run(e))
        print(sess.run(c))
        print(sess.run(f))


if __name__ == '__main__':
    tf_attention()
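A side note: the explicit `tf.tile` is not strictly needed, since element-wise multiplication broadcasts a trailing singleton dimension against the values tensor, in TensorFlow as in NumPy. A minimal NumPy sketch of the same computation, shown here only to illustrate the broadcasting shortcut:

```python
import numpy as np

# Same attention demo via broadcasting: the (2, 2, 1) weight tensor
# broadcasts against the (2, 2, 3) values, so no explicit tile is needed.
a = np.array([[2, 3], [4, 5]])                  # weights, shape (2, 2)
b = np.array([[[1, 1, 1], [2, 2, 2]],
              [[3, 3, 3], [4, 4, 4]]])          # values, shape (2, 2, 3)

c = a[:, :, None] * b   # broadcast multiply, shape (2, 2, 3)
f = c.sum(axis=1)       # weighted sum over dimension 1, shape (2, 3)

print(c)  # [[[2 2 2] [6 6 6]] [[12 12 12] [20 20 20]]]
print(f)  # [[ 8  8  8] [32 32 32]]
```

In TensorFlow the equivalent is simply `c = tf.expand_dims(a, 2) * b`, skipping the tile.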