TensorFlow 101

Node types:

  • constants (no inputs; output a value stored internally), e.g. tf.constant(3.0)
  • operations (take input tensors, produce output tensors), e.g. tf.add(node1, node2) or node1 + node2
  • (Variables and placeholders are operations too. For a variable, an 'assign' op may also be created to set its initial value.)
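
  • A minimal sketch of these node types (assumes the TensorFlow 1.x API used throughout these notes):

import tensorflow as tf

node1 = tf.constant(3.0, dtype=tf.float32)  # constant node: no inputs, holds the value 3.0
node2 = tf.constant(4.0)                    # dtype tf.float32 is inferred
node3 = tf.add(node1, node2)                # operation node: two inputs, one output
print(node3)  # prints the Tensor object, not 7.0; nothing is computed until a session runs it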

Tensor shapes:

3 # a rank 0 tensor; this is a scalar with shape []
[1. ,2., 3.] # a rank 1 tensor; this is a vector with shape [3]
[[1., 2., 3.], [4., 5., 6.]] # a rank 2 tensor; a matrix with shape [2, 3]
[[[1., 2., 3.]], [[7., 8., 9.]]] # a rank 3 tensor with shape [2, 1, 3]
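
A quick way to check these shapes (a small sketch, TF 1.x; shapes are known without running a session):

import tensorflow as tf

print(tf.constant(3).shape)                                 # ()
print(tf.constant([1., 2., 3.]).shape)                      # (3,)
print(tf.constant([[1., 2., 3.], [4., 5., 6.]]).shape)      # (2, 3)
print(tf.constant([[[1., 2., 3.]], [[7., 8., 9.]]]).shape)  # (2, 1, 3)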

TensorFlow 101:

  • A computational graph is a series of TF operations arranged into a graph of nodes.

  • Each node takes zero or more tensors as inputs and produces a tensor as an output.

  • To actually evaluate the nodes, we must run the computational graph within a session: sess = tf.Session(); sess.run([node1, node2])
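
  • For example (a sketch assuming TF 1.x):

import tensorflow as tf

node1 = tf.constant(3.0)
node2 = tf.constant(4.0)
sess = tf.Session()
print(sess.run([node1, node2]))  # [3.0, 4.0]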

  • A graph can be parameterized to accept external inputs, known as placeholders. A placeholder is a promise to provide a value later.

  • a = tf.placeholder(tf.float32); b = tf.placeholder(tf.float32); adder_node = a + b

  • To specify the placeholder values (tensors) when evaluating the graph, pass a feed_dict parameter: sess.run(adder_node, {a: [1,3], b: [2, 4]})
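
  • Putting the placeholder snippets above together (a sketch; the feed values here are arbitrary):

import tensorflow as tf

a = tf.placeholder(tf.float32)
b = tf.placeholder(tf.float32)
adder_node = a + b  # shortcut for tf.add(a, b)

sess = tf.Session()
print(sess.run(adder_node, {a: 3, b: 4.5}))          # 7.5
print(sess.run(adder_node, {a: [1, 3], b: [2, 4]}))  # [3. 7.]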

  • Variables allow us to add trainable parameters to a graph. They are constructed with a type and initial value:

W = tf.Variable([.3], dtype=tf.float32)
b = tf.Variable([-.3], dtype=tf.float32)
x = tf.placeholder(tf.float32)
linear_model = W * x + b
  • To initialize all the variables in a TensorFlow program:

init = tf.global_variables_initializer(); sess.run(init)

  • Example of evaluating the model's loss:
y = tf.placeholder(tf.float32)
squared_deltas = tf.square(linear_model - y)
loss = tf.reduce_sum(squared_deltas)
print(sess.run(loss, {x:[1,2,3,4], y:[0,-1,-2,-3]}))
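
  • With the initial values W = .3 and b = -.3, the predictions for x = [1,2,3,4] are [0., .3, .6, .9], the squared deltas are [0., 1.69, 6.76, 15.21], and the printed loss is 23.66.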

Training:

  • Variable values can be changed explicitly with the tf.assign operation: fixW = tf.assign(W, [-1.]); sess.run(fixW)
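  • A self-contained sketch of that manual fix (mirrors the snippets above; with W = -1 and b = 1 the line fits the data exactly, so the loss drops to 0):
import tensorflow as tf

W = tf.Variable([.3], dtype=tf.float32)
b = tf.Variable([-.3], dtype=tf.float32)
x = tf.placeholder(tf.float32)
y = tf.placeholder(tf.float32)
linear_model = W * x + b
loss = tf.reduce_sum(tf.square(linear_model - y))

sess = tf.Session()
sess.run(tf.global_variables_initializer())

fixW = tf.assign(W, [-1.])  # overwrite W with the value that fits the data
fixb = tf.assign(b, [1.])   # likewise for b
sess.run([fixW, fixb])
print(sess.run(loss, {x: [1, 2, 3, 4], y: [0, -1, -2, -3]}))  # 0.0
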
  • But usually we'll train with an optimizer to minimize the loss by changing the variables' values:
optimizer = tf.train.GradientDescentOptimizer(0.01)
train_op = optimizer.minimize(loss)

sess.run(init) # reset W and b to their (incorrect) initial values
for i in range(1000):
  sess.run(train_op, {x:[1,2,3,4], y:[0,-1,-2,-3]})

print(sess.run([W, b]))
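
  • After those 1000 steps, W and b should end up very close to -1 and 1 respectively, which fit the data (y = 1 - x) exactly.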