@yukoba
Last active September 9, 2016 04:33
AdaGrad using theano.scan()
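This gist minimizes y = x**2 - x (minimum at x = 0.5) with AdaGrad, expressing the iterative update as a theano.scan() loop: each step accumulates the squared gradient, r <- r + g**2, and moves x <- x - learning_rate * g / sqrt(r). With learning rate 0.3 and 100 steps, the result converges to 0.5.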
import theano
import theano.tensor as T

# AdaGrad using theano.scan()
def fn(x, r, learning_rate):
    # One AdaGrad step on y = x**2 - x (minimized at x = 0.5)
    y = x ** 2 - x
    g = T.grad(y, x)
    r += g ** 2  # accumulate squared gradients
    return x - learning_rate / T.sqrt(r) * g, r

init_x = T.dscalar()
init_r = T.dscalar()
result, updates = theano.scan(fn=fn,
                              outputs_info=(init_x, init_r),
                              non_sequences=0.3,  # learning rate
                              n_steps=100)
f2 = theano.function([init_x], result[0][-1], givens={init_r: 1e-8}, updates=updates)
print(f2(1))  # 0.5
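
For reference, a minimal plain-NumPy sketch of the same loop follows (not part of the original gist); the closed-form gradient 2*x - 1 stands in for T.grad, and the constants (100 steps, learning rate 0.3, r initialized to 1e-8) match the scan above.

import numpy as np

# Same AdaGrad iteration, written as an explicit Python loop
x, r, learning_rate = 1.0, 1e-8, 0.3
for _ in range(100):
    g = 2 * x - 1                        # d/dx (x**2 - x), computed by hand
    r += g ** 2                          # accumulate squared gradients
    x -= learning_rate / np.sqrt(r) * g  # AdaGrad update
print(x)  # ~0.5, matching the theano.scan() result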