@kballenegger
Created March 3, 2012 02:17
# Gradient descent via numerical differentiation.
#
# Assumes a `derivative(precision_magnitude) { |x| ... }` helper (not shown
# in this gist) that returns a proc approximating the derivative of the
# given single-variable function.
def gradient_descent(learning_rate, precision_magnitude, &f)
  # One theta per block parameter; `f.call(array)` auto-splats the array
  # onto the block's parameters, since blocks are procs, not lambdas.
  thetas = Array.new(f.arity, 0.0)
  tolerance = 10.0 ** -precision_magnitude
  converged = false
  until converged
    # Dup so all components are updated simultaneously; assigning
    # `new_thetas = thetas` would alias the same array.
    new_thetas = thetas.dup
    converged = true
    thetas.each_index do |j|
      # Partial derivative w.r.t. thetas[j]: vary only the j-th component.
      # The inner dup matters too — mutating `thetas` here would corrupt
      # the point the derivative is evaluated at.
      prime = derivative(precision_magnitude) { |x|
        tmp_thetas = thetas.dup
        tmp_thetas[j] = x
        f.call(tmp_thetas)
      }.call(thetas[j])
      new_thetas[j] = thetas[j] - learning_rate * prime
      converged = false if prime.abs > tolerance
    end
    thetas = new_thetas
  end
  thetas
end
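The gist relies on a `derivative` helper that it does not include. A minimal central-difference sketch, assuming the same convention as above (the magnitude argument sets the step size to 10^-precision_magnitude; the helper name and step-size choice are assumptions, not part of the gist):

```ruby
# Assumed helper: returns a proc approximating f'(x) with a central
# difference of step h = 10^-precision_magnitude.
def derivative(precision_magnitude, &f)
  h = 10.0 ** -precision_magnitude
  ->(x) { (f.call(x + h) - f.call(x - h)) / (2.0 * h) }
end

# Quick check: d/dx x^2 at x = 3 should be 6; the central difference is
# exact for quadratics up to floating-point error.
slope = derivative(6) { |x| x * x }.call(3.0)
```

With a helper like this in scope, the descent could be invoked as, e.g., `gradient_descent(0.1, 4) { |a, b| (a - 3)**2 + (b + 1)**2 }`, which should approach `[3, -1]`.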