@billju
Last active December 2, 2021 01:57
import numpy as np
# activation function: logistic sigmoid
sigmoid = lambda x: 1/(1+np.exp(-x))
# set the input, target output and learning rate
X = np.array([
[1,0,1]
])
y = np.array([
[1]
])
lr = 0.9
# initialize weights and biases (3 inputs -> 2 hidden units -> 1 output)
w1 = np.array([
[0.2,-0.3],
[0.4,0.1],
[-0.5,0.2]
])
b1 = np.array([
[-0.4,0.2]
])
w2 = np.array([
[-0.3],
[-0.2]
])
b2 = np.array([
[0.1]
])
# forward pass: hidden activations o1, network output o2
o1 = sigmoid(X.dot(w1)+b1)
o2 = sigmoid(o1.dot(w2)+b2)
# backward pass: error deltas, using the sigmoid derivative o*(1-o)
bp2 = o2*(1-o2)*(y-o2)    # output-layer delta
bp1 = o1*(1-o1)*bp2*w2.T  # hidden-layer delta (error propagated back through w2)
# update biases and weights in the direction that reduces the error
b2 += lr*bp2
w2 += np.outer(o1, lr*bp2)
b1 += lr*bp1
w1 += np.outer(X, lr*bp1)
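The snippet above performs exactly one gradient step. As a minimal sketch (assuming the same single training example, squared-error loss, and the initial parameters from the gist), the step can be repeated in a loop until the output converges toward the target:

```python
import numpy as np

sigmoid = lambda x: 1/(1+np.exp(-x))  # logistic activation

# same single training example and initial parameters as in the gist
X = np.array([[1.0, 0.0, 1.0]])
y = np.array([[1.0]])
lr = 0.9
w1 = np.array([[0.2, -0.3], [0.4, 0.1], [-0.5, 0.2]])
b1 = np.array([[-0.4, 0.2]])
w2 = np.array([[-0.3], [-0.2]])
b2 = np.array([[0.1]])

for epoch in range(1000):
    # forward pass
    o1 = sigmoid(X.dot(w1) + b1)
    o2 = sigmoid(o1.dot(w2) + b2)
    # backward pass: deltas scaled by the sigmoid derivative o*(1-o)
    bp2 = o2*(1-o2)*(y-o2)
    bp1 = o1*(1-o1)*bp2*w2.T
    # parameter update
    b2 += lr*bp2
    w2 += np.outer(o1, lr*bp2)
    b1 += lr*bp1
    w1 += np.outer(X, lr*bp1)
```

After enough iterations the output o2 approaches the target y = 1; the gist's one-shot update is simply a single iteration of this loop.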