@ahwillia
Created October 30, 2016 06:52
Alternating Minimization in Tensorflow (PCA example)
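The script below factorizes a synthetic rank-R matrix Y = W_true · C_true by alternating Adam steps on the two factors. Written out for reference, the objective it minimizes is the ridge-regularized squared error

    \min_{W, C} \; \lVert Y - WC \rVert_F^2 + \alpha \left( \lVert W \rVert_F^2 + \lVert C \rVert_F^2 \right)

and each iteration first updates W with C held fixed, then updates C with W held fixed.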
import numpy as np
import tensorflow as tf
# N: dimension of the square data matrix; R: rank of the data
N = 100
R = 5
# generate data
W_true = np.random.randn(N,R)
C_true = np.random.randn(R,N)
Y_true = np.dot(W_true, C_true)
Y_tf = tf.constant(Y_true.astype(np.float32))
W = tf.Variable(np.random.randn(N,R).astype(np.float32), name='W')
C = tf.Variable(np.random.randn(R,N).astype(np.float32), name='C')
Y_est = tf.matmul(W,C)
loss = tf.reduce_sum((Y_tf-Y_est)**2)
# regularization
alpha = tf.constant(1e-4)
regW = alpha*tf.reduce_sum(W**2)
regC = alpha*tf.reduce_sum(C**2)
# full objective
objective = loss + regW + regC
# optimization setup
optimizer = tf.train.AdamOptimizer(0.001)
train_step = optimizer.minimize(objective, var_list=[W,C])  # joint update of W and C (not used in the loop below)
train_W = optimizer.minimize(objective, var_list=[W])  # update W only
train_C = optimizer.minimize(objective, var_list=[C])  # update C only
# fit the model
init_op = tf.initialize_all_variables()  # tf.global_variables_initializer() in later TF 1.x
with tf.Session() as sess:
    sess.run(init_op)
    for n in range(10000):
        # update W, holding C fixed
        sess.run(train_W)
        # update C, holding W fixed
        _, objval = sess.run([train_C, objective])
        # print progress
        if (n+1) % 1000 == 0:
            print('iter %i, %f' % (n+1, objval))
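
For comparison, here is a minimal NumPy sketch of the same alternating scheme with exact closed-form ridge updates in place of the Adam steps. Variable names mirror the gist; the iteration count and print interval are illustrative choices, and the update formulas are the standard regularized least-squares solutions for each factor with the other held fixed.

import numpy as np

N, R, alpha = 100, 5, 1e-4

# synthetic low-rank data, as above
W_true = np.random.randn(N, R)
C_true = np.random.randn(R, N)
Y = W_true @ C_true

# random initialization of the factors
W = np.random.randn(N, R)
C = np.random.randn(R, N)

I = np.eye(R)
for n in range(50):
    # exact ridge update for W with C fixed: W = Y C^T (C C^T + alpha*I)^-1
    W = Y @ C.T @ np.linalg.inv(C @ C.T + alpha * I)
    # exact ridge update for C with W fixed: C = (W^T W + alpha*I)^-1 W^T Y
    C = np.linalg.inv(W.T @ W + alpha * I) @ W.T @ Y
    # same objective as the TensorFlow version
    obj = np.sum((Y - W @ C) ** 2) + alpha * (np.sum(W ** 2) + np.sum(C ** 2))
    if (n + 1) % 10 == 0:
        print('iter %i, %f' % (n + 1, obj))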
djmirv commented Jul 19, 2017

Isn't this just an autoencoder?

KFDing commented Aug 1, 2017

Is there any side effect from training W and C separately? Is it the same as updating them together?
