@actsasbuffoon
Last active May 4, 2016 16:19
A simple neural network example.
class Neuron
  attr_accessor :inputs, :threshold, :weights

  def initialize(threshold, weights)
    @threshold = threshold
    @weights = weights
    @inputs = []
  end

  # A neuron fires (returns 1) when the weighted sum of its inputs meets or
  # exceeds the threshold; otherwise it returns 0.
  def output
    weighted = inputs.zip(weights).map do |v, w|
      # An input can be a number or another neuron. If it's a neuron, we need to get its output.
      v = v.output if v.respond_to? :output
      w * v
    end
    weighted.reduce(:+) >= threshold ? 1 : 0
  end
end
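
# A quick illustration of the Neuron class on its own (the gate names below are
# just examples and are not used by the network that follows): a neuron with
# threshold 2 and weights [1, 1] fires only when both inputs are 1, so it acts
# as an AND gate, and one neuron can be fed into another as an input.
and_gate = Neuron.new(2, [1, 1])
and_gate.inputs = [1, 1]
and_gate.output # => 1

not_gate = Neuron.new(0, [-1])
not_gate.inputs = [and_gate]
not_gate.output # => 0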
# Layer 1 takes input from two pairs of imaginary switches.
# We want the net to return 1 only when exactly one switch in each pair is active.
# Layer 1
# This neuron acts as a NOR gate: it fires only when both of its inputs are 0.
l1a = Neuron.new(0, [-1, -1])
# This neuron acts as an AND gate.
l1b = Neuron.new(2, [1, 1])
# The second pair of switches gets its own NOR and AND gates.
l1c = Neuron.new(0, [-1, -1])
l1d = Neuron.new(2, [1, 1])
# Layer 2
# These are both NOR gates. Feeding a pair's NOR and AND into another NOR produces an XOR.
l2a = Neuron.new(0, [-1, -1])
l2a.inputs = [l1a, l1b]
l2b = Neuron.new(0, [-1, -1])
l2b.inputs = [l1c, l1d]
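
# A quick check of the XOR claim above: feed every combination of one switch
# pair into layer 1 and read layer 2's output. (These inputs are reassigned
# again in the loop below, so this is only a sanity check.)
[0, 1].product([0, 1]).map do |a, b|
  l1a.inputs = [a, b]
  l1b.inputs = [a, b]
  l2a.output
end # => [0, 1, 1, 0]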
# Layer 3
# This is an AND gate.
l3 = Neuron.new(2, [1, 1])
l3.inputs = [l2a, l2b]
# Run every combination of the four switches through the net and print the result.
[0, 1].product([0, 1], [0, 1], [0, 1]).each do |inputs|
  puts inputs.inspect
  pair1 = inputs.take 2
  pair2 = inputs.drop 2
  l1a.inputs = pair1
  l1b.inputs = pair1
  l1c.inputs = pair2
  l1d.inputs = pair2
  puts l3.output
end
# === Output ===
# [0, 0, 0, 0]
# 0
# [0, 0, 0, 1]
# 0
# [0, 0, 1, 0]
# 0
# [0, 0, 1, 1]
# 0
# [0, 1, 0, 0]
# 0
# [0, 1, 0, 1]
# 1
# [0, 1, 1, 0]
# 1
# [0, 1, 1, 1]
# 0
# [1, 0, 0, 0]
# 0
# [1, 0, 0, 1]
# 1
# [1, 0, 1, 0]
# 1
# [1, 0, 1, 1]
# 0
# [1, 1, 0, 0]
# 0
# [1, 1, 0, 1]
# 0
# [1, 1, 1, 0]
# 0
# [1, 1, 1, 1]
# 0