@pkofod
Created January 24, 2017 09:28
using Optim, Calculus
# Let's try with a polynomial. It has a very simple Hessian: there are no cross
# products, only quadratic terms, so the Hessian is a diagonal matrix with
# diag(H) = [2, 2, ..., 2]. Let's see if that's what we get!
function large_polynomial(x::Vector)
    res = zero(x[1])
    for i in 1:250
        res += (i - x[i])^2
    end
    return res
end
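# As a sanity check of the comment above, we can build the exact Hessian
# directly: every diagonal entry is 2 and every off-diagonal entry is 0.
# (This block is an added illustration, not part of the original gist;
# `H_exact` is a name chosen here. `diagm(::Vector)` is the form used on
# Julia versions contemporary with this gist; newer versions spell it
# `diagm(0 => v)` from the LinearAlgebra stdlib.)
H_exact = diagm(fill(2.0, 250))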
# Define the Optim-compatible Hessian approximation
h!(x::Vector, storage::Matrix) = Calculus.finite_difference_hessian!(large_polynomial, x, storage)
# Let's take initial values and solution from Optim.
prob = Optim.UnconstrainedProblems.examples["Large Polynomial"]
# Get values
initial_x = prob.initial_x
minimizer = prob.solutions
# Build matrix with solution
h_stor = ones(length(initial_x), length(initial_x))
h!(minimizer, h_stor)
# Great, it works. Now from the initial value
h_stor = ones(length(initial_x), length(initial_x))
h!(initial_x, h_stor)
# Oh no. Let's see what's wrong. sum(h_stor) is around 1757, but it should be
# 500: the 250 diagonal entries of 2, with zeros everywhere else.
sum(h_stor)
# Hmm, weird. Let's try to see where the offending entries are
sparse(h_stor)
# They're all around, and quite large!
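# To quantify how far off the finite-difference Hessian is at the initial
# point, one could compare it entry-wise against the exact Hessian 2*I.
# (An added sketch, not part of the original gist; `H_exact` is a
# hypothetical name, and `diagm(::Vector)` matches the Julia version
# contemporary with this gist.)
H_exact = diagm(fill(2.0, length(initial_x)))
# Largest absolute deviation from the true Hessian
maximum(abs.(h_stor .- H_exact))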