@anjalibhavan
Created August 5, 2018 17:44
Basic Neural Network Implementation Using NumPy
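The snippet below trains a fully connected network with one hidden layer and sigmoid activations in both layers on random data, using plain batch gradient descent. Activations are kept in (features, batch) orientation, which is why the inputs and targets are transposed.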
import numpy as np
# m is batch size; D_in is input dimension;
# H is hidden dimension; D_out is output dimension.
m, D_in, H, D_out, learning_rate = 64, 1000, 100, 10, 0.1
# Create random input and output data
x = np.random.randn(m, D_in)
y = np.random.randn(m, D_out)
def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))
# Initialize weights; the 0.01 scaling keeps the initial weights small,
# which helps keep the sigmoid activations away from their saturated ends
W1 = np.random.randn(H, D_in) * 0.01
b1 = np.random.randn(H, 1)
W2 = np.random.randn(D_out, H) * 0.01
b2 = np.random.randn(D_out, 1)
for t in range(100):
    # Forward prop
    Z1 = np.dot(W1, x.T) + b1   # pre-activations, shape (H, m)
    A1 = sigmoid(Z1)
    Z2 = np.dot(W2, A1) + b2    # pre-activations, shape (D_out, m)
    A2 = sigmoid(Z2)
    # Backprop
    dz2 = A2 - y.T
    dw2 = (1/m) * np.dot(dz2, A1.T)
    db2 = (1/m) * np.sum(dz2, axis=1, keepdims=True)
    # sigmoid'(Z1) = A1 * (1 - A1): the derivative is taken on the
    # activation A1, not on the raw pre-activation Z1
    dz1 = np.dot(W2.T, dz2) * A1 * (1 - A1)
    dw1 = (1/m) * np.dot(dz1, x)
    db1 = (1/m) * np.sum(dz1, axis=1, keepdims=True)
    # Update
    W1 -= learning_rate * dw1
    W2 -= learning_rate * dw2
    b1 -= learning_rate * db1
    b2 -= learning_rate * db2
print('done!')
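
To sanity-check the trained weights, here is a minimal usage sketch, assuming the code above has already run. `predict` is a hypothetical helper not in the original gist; it simply replays the forward pass on a batch shaped (batch, D_in). The squared-error printout is only a rough progress readout, not the loss the backprop step implements (the gradient dz2 = A2 - y.T is the form that arises from a sigmoid output paired with binary cross-entropy, where the sigmoid derivative cancels).

# Hypothetical helper (not in the original gist): replay the forward
# pass on a batch of inputs shaped (batch, D_in)
def predict(x_batch):
    A1 = sigmoid(np.dot(W1, x_batch.T) + b1)  # hidden activations, (H, batch)
    A2 = sigmoid(np.dot(W2, A1) + b2)         # output activations, (D_out, batch)
    return A2.T                               # back to (batch, D_out)

preds = predict(x)
print('mean squared error:', np.mean((preds - y) ** 2))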