import tensorflow as tf

graph = tf.Graph()
with graph.as_default():
    # Placeholders for a batch of 100 flattened 784-dimensional inputs
    # (e.g. 28x28 MNIST images) and their one-hot labels over 10 classes
    input = tf.placeholder(tf.float32, shape=(100, 784))
    labels = tf.placeholder(tf.float32, shape=(100, 10))

    # A single fully connected layer: normally distributed weights, zero biases
    layer1_weights = tf.Variable(tf.random_normal([784, 10]))
    layer1_bias = tf.Variable(tf.zeros([10]))

    # Logits and the average softmax cross-entropy loss over the batch
    logits = tf.matmul(input, layer1_weights) + layer1_bias
    cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels))

    # Minimize the loss with plain gradient descent
    learning_rate = 0.01
    optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
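
A minimal sketch of how this graph could be driven, not part of the original gist: open a session, initialize the variables, and run the optimizer with feed_dict. The random NumPy batches below are purely illustrative stand-ins for a real dataset such as MNIST.

import numpy as np

# Hypothetical training loop; synthetic data is a placeholder for real batches
with tf.Session(graph=graph) as session:
    session.run(tf.global_variables_initializer())
    for step in range(100):
        batch_inputs = np.random.rand(100, 784).astype(np.float32)
        batch_labels = np.eye(10)[np.random.randint(0, 10, size=100)].astype(np.float32)
        _, loss = session.run([optimizer, cost],
                              feed_dict={input: batch_inputs, labels: batch_labels})
        if step % 20 == 0:
            print("Step %d, loss %.4f" % (step, loss))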