@karan-ta
Last active March 13, 2020 16:05
Perceptron neural network in Python from scratch - under 10 lines of code - works for AND gate, OR gate
#Read this first - https://jontysinai.github.io/jekyll/update/2017/09/24/the-mcp-neuron.html
#Set the inputs and the desired outputs for the AND gate.
x1,x2,y = [0,0,1,1],[0,1,0,1], [0,0,0,1]
#Initialize the weights w1, w2 and the bias b, all to 0.
w1,w2,b = 0,0,0
#Run 100 training epochs to see if the perceptron can learn the AND gate.
for _ in range(100):
    #iterate over each training example
    for i in [0,1,2,3]:
        #guess a y value based on the current weights and bias
        y_guess = w1*x1[i]+w2*x2[i]+b
        #convert y_guess to 0 or 1 (step activation)
        y_guess = 1 if y_guess > 0 else 0
        #for the guessed y => calculate the error/difference from the actual y value
        error = y[i] - y_guess
        #now tune/tweak/adjust w1 to reduce the error in the next iteration
        w1 = w1 + 0.01*error*x1[i]
        #now tune/tweak/adjust w2 to reduce the error in the next iteration
        w2 = w2 + 0.01*error*x2[i]
        #now tune/tweak/adjust the bias to reduce the error in the next iteration
        b = b + 0.01*error
#voila - after 100 epochs, the perceptron has found weights and a bias that implement the AND gate:
print(w1*0+w2*0+b > 0)
print(w1*0+w2*1+b > 0)
print(w1*1+w2*0+b > 0)
print(w1*1+w2*1+b > 0)
#please follow me on twitter - @mrtechmaker
# print(w1)
# print(w2)
# print(b)
#inspiration => https://medium.com/@thomascountz/19-line-line-by-line-python-perceptron-b6f113b161f3
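#As the title says, the same loop also learns an OR gate. A minimal sketch of that variant (an illustrative addition:
#it simply swaps in the OR targets, resets the parameters, and reuses the exact update rule above):
x1, x2, y = [0,0,1,1], [0,1,0,1], [0,1,1,1]
w1, w2, b = 0, 0, 0
for _ in range(100):
    for i in [0,1,2,3]:
        y_guess = 1 if w1*x1[i]+w2*x2[i]+b > 0 else 0
        error = y[i] - y_guess
        w1 = w1 + 0.01*error*x1[i]
        w2 = w2 + 0.01*error*x2[i]
        b = b + 0.01*error
#verify: should print False, True, True, True for the OR truth table
print(w1*0+w2*0+b > 0)
print(w1*0+w2*1+b > 0)
print(w1*1+w2*0+b > 0)
print(w1*1+w2*1+b > 0)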