@aicentral
Created February 12, 2017 19:37
Gradient Descent
# Simple implementation of the example on page 96 of
# http://www.deeplearningbook.org/contents/numerical.html
# 4.5 Example: Linear Least Squares
#
# Minimize f(x) = (1/2) * ||Ax - B||^2 with gradient descent.
# The gradient is grad f(x) = A^T A x - A^T B = A^T (Ax - B).
import numpy as np

X = np.random.rand(2, 1)   # random starting point
sigma = 0.000001           # stopping tolerance on the gradient norm
epsilon = 0.03             # step size (learning rate)
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[1.0], [1.0]])  # column vector so shapes match X

def gradient(A, X, B):
    return np.dot(np.dot(A.T, A), X) - np.dot(A.T, B)

grad = gradient(A, X, B)
i = 0
print('Iteration', i, 'Current gradient =', np.linalg.norm(grad, ord=2))
while np.linalg.norm(grad, ord=2) > sigma:
    i += 1
    if i % 100 == 0:
        print('Iteration', i, 'Current gradient =', np.linalg.norm(grad, ord=2))
    X = X - epsilon * grad
    grad = gradient(A, X, B)
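As a quick sanity check (not part of the original gist), the result can be compared against NumPy's closed-form least-squares solver; a minimal sketch, assuming it runs right after the loop above:

# Sanity check: compare against the closed-form solution of
# min_x ||Ax - B||^2, computed by np.linalg.lstsq.
X_exact = np.linalg.lstsq(A, B, rcond=None)[0]
print('Gradient descent solution:', X.ravel())
print('Closed-form solution:    ', X_exact.ravel())
print('Max absolute difference: ', np.max(np.abs(X - X_exact)))

One note on the step size: for this quadratic, gradient descent with a fixed step converges only if epsilon < 2 / lambda_max(A^T A), which here is about 2 / 29.9 ≈ 0.067, so epsilon = 0.03 is safely inside that range. The small lambda_min ≈ 0.13 is what makes the final iterations slow.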