@Millsky
Last active April 3, 2016 16:09
An implementation of a single perceptron that predicts the output of a linear function
import numpy as np

# Training inputs and their targets; the targets follow the linear rule y = 2x + 3.
inputs = np.array([[1], [100], [3], [30], [40]])
testSet = np.array([[10], [13], [19], [33], [1]])
outputs = np.array([[5], [203], [9], [63], [83]])

bias = 1
biasWeight = 1

# Initialize the single weight randomly in [0, 1).
weights = np.random.random()

print inputs.shape

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Online training: each epoch steps the weight and bias weight toward the
# sigmoid-squashed target with a delta-rule update.
for j in xrange(1, 70000):
    for k in xrange(len(inputs)):
        x = (inputs[k] * weights) + (1 * biasWeight)
        x = sigmoid(x)
        biasWeight = biasWeight + sigmoid(outputs[k]) - x
        weights = weights + ((sigmoid(outputs[k]) - x) * inputs[k])
    if j % 10000 == 0:
        print weights

# Predict on the test set with the learned linear model.
print (testSet * weights) + (bias * biasWeight)
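
For reference, the training targets above satisfy y = 2x + 3 exactly, so a well-trained weight and bias should land near 2 and 3. A minimal sketch (plain NumPy, Python 2 like the gist) that checks this relation and lists the test-set values the rule would produce, for comparison with the perceptron's predictions:

import numpy as np

inputs = np.array([[1], [100], [3], [30], [40]])
outputs = np.array([[5], [203], [9], [63], [83]])

# Verify the targets follow y = 2x + 3 exactly.
print np.array_equal(outputs, 2 * inputs + 3)  # True

# Expected test-set outputs under the same rule.
testSet = np.array([[10], [13], [19], [33], [1]])
print 2 * testSet + 3  # [[23], [29], [41], [69], [5]]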