
@alinazhanguwo
Created March 22, 2019 23:02
import math
import tensorflow as tf

# Create an interactive TensorFlow session
sess = tf.InteractiveSession()
# These will be inputs for the model
# Input pixels of images, flattened
# 1296 = 36*36 which is the size of images
x = tf.placeholder("float", [None, 1296])
## Known labels
y_ = tf.placeholder("float", [None,2])
# Hidden layer 1 with 256 neurons
num_hidden1 = 256
# Variables
# W1 is for weights
# b1 is for bias
W1 = tf.Variable(tf.truncated_normal([1296, num_hidden1], stddev=1./math.sqrt(1296)))
b1 = tf.Variable(tf.constant(0.1,shape=[num_hidden1]))
# Compute the activation of the weighted sum -> produces 256 intermediate values
# Nonlinear transform - activation function: sigmoid
h1 = tf.sigmoid(tf.matmul(x,W1) + b1)
# Hidden layer 2 with 64 neurons
num_hidden2 = 64
W2 = tf.Variable(tf.truncated_normal([num_hidden1, num_hidden2], stddev=2./math.sqrt(num_hidden1)))
b2 = tf.Variable(tf.constant(0.2,shape=[num_hidden2]))
h2 = tf.sigmoid(tf.matmul(h1,W2) + b2)
# Output Layer
# Logistic regression again
W3 = tf.Variable(tf.truncated_normal([num_hidden2, 2], stddev=1./math.sqrt(2)))
b3 = tf.Variable(tf.constant(0.1,shape=[2]))
# Initialize all variables
sess.run(tf.global_variables_initializer())
# Define model
y = tf.nn.softmax(tf.matmul(h2,W3) + b3)
# Model specification is complete; the next step is to train it
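For reference, the forward pass defined above (1296 -> 256 -> 64 -> 2, sigmoid hidden layers, softmax output) can be sketched in plain NumPy. This is an illustrative shape check, not the gist's TensorFlow graph; the random weights and batch size of 4 are assumptions for the sketch.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Subtract the row max for numerical stability before exponentiating
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 1296))                    # batch of 4 flattened 36x36 images

# Same initialization scales as the gist, here with random normals
W1 = rng.standard_normal((1296, 256)) / np.sqrt(1296)
b1 = np.full(256, 0.1)
W2 = rng.standard_normal((256, 64)) * 2 / np.sqrt(256)
b2 = np.full(64, 0.2)
W3 = rng.standard_normal((64, 2)) / np.sqrt(2)
b3 = np.full(2, 0.1)

h1 = sigmoid(x @ W1 + b1)                             # shape (4, 256)
h2 = sigmoid(h1 @ W2 + b2)                            # shape (4, 64)
y = softmax(h2 @ W3 + b3)                             # shape (4, 2); each row sums to 1
```

Each row of `y` is a probability distribution over the two classes, which is what a cross-entropy loss against the one-hot labels `y_` would consume during training.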