@Maxim-Mazurok
Created January 26, 2020 21:29
Demo of very basic Neural Network in Python 3 (Siraj Raval - Build a Neural Net in 4 Minutes)
# Credits: https://www.youtube.com/watch?v=h3l4qz76JhQ
# Basic description: teach the NN to output [0, 1, 1, 0] for four different inputs
import numpy as np


def nonlin(x, deriv=False):
    # Sigmoid activation. When deriv=True, x is assumed to already be a
    # sigmoid output, so x * (1 - x) is the derivative at that point.
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))


# input data: 4 samples, 3 features each
x = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])

# output data: the target for each sample
y = np.array([[0],
              [1],
              [1],
              [0]])

np.random.seed(1)

# synapses (weight matrices), initialized randomly in [-1, 1)
syn0 = 2 * np.random.random((3, 4)) - 1  # input -> hidden
syn1 = 2 * np.random.random((4, 1)) - 1  # hidden -> output

# training loop
for j in range(60000):
    # forward pass through both layers
    l0 = x
    l1 = nonlin(np.dot(l0, syn0))
    l2 = nonlin(np.dot(l1, syn1))

    # backpropagation: error at the output, scaled by the sigmoid slope
    l2_error = y - l2
    if j % 10000 == 0:
        print("Error:" + str(np.mean(np.abs(l2_error))))
    l2_delta = l2_error * nonlin(l2, deriv=True)

    # propagate the output delta back to the hidden layer
    l1_error = l2_delta.dot(syn1.T)
    l1_delta = l1_error * nonlin(l1, deriv=True)

    # update weights
    syn1 += l1.T.dot(l2_delta)
    syn0 += l0.T.dot(l1_delta)

print("Output after training:")
print(l2)
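As a quick sanity check (not part of the original gist), the trained network's raw sigmoid outputs can be rounded and compared against the targets. The sketch below condenses the same training loop; with `np.random.seed(1)` and 60000 iterations the rounded predictions should match `[0, 1, 1, 0]`:

```python
import numpy as np

def nonlin(x, deriv=False):
    # Same sigmoid helper as in the gist above
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

x = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])
y = np.array([[0], [1], [1], [0]])

np.random.seed(1)
syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1

for _ in range(60000):
    l1 = nonlin(x.dot(syn0))
    l2 = nonlin(l1.dot(syn1))
    l2_delta = (y - l2) * nonlin(l2, deriv=True)
    l1_delta = l2_delta.dot(syn1.T) * nonlin(l1, deriv=True)
    syn1 += l1.T.dot(l2_delta)
    syn0 += x.T.dot(l1_delta)

# Rounded predictions should reproduce the target column [0, 1, 1, 0]
print(np.round(l2).astype(int).ravel().tolist())
```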