What is it?

  • an algorithm to solve non-linear least squares minimization problems (see the example call below)
  • finds a local minimum, not necessarily the global minimum
  • alternatives: Gauss–Newton, gradient-descent
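
As a concrete picture of what "solving" such a problem looks like in practice, here is a minimal sketch using SciPy's off-the-shelf LM solver. The model y = b1 * exp(b2 * x) and the data are made up for illustration; only `scipy.optimize.least_squares(..., method="lm")` itself is a real API (it wraps a MINPACK implementation).

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative data generated from y = 2.5 * exp(1.3 * x).
x = np.linspace(0, 1, 50)
y = 2.5 * np.exp(1.3 * x)

def residuals(b):
    # r_i = y_i - f(x_i, b) for the model f(x, b) = b1 * exp(b2 * x)
    return y - b[0] * np.exp(b[1] * x)

# method="lm" selects the Levenberg-Marquardt solver.
result = least_squares(residuals, x0=[1.0, 1.0], method="lm")
print(result.x)  # approximately [2.5, 1.3]
```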

Why use it?

  • LM algorithm combines the advantages of gradient-descent and Gauss-Newton
  • allows the initial values to be further away from the solution than Gauss-Newton does
  • gradient-descent-like steps dominate while far from the solution; once the valley ("canyon") around the minimum is reached, Gauss-Newton-like steps take over (see the update rule below)
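
The gist does not spell out the update rule, but the standard LM formulation makes this blending concrete. Below, J is the Jacobian of the model f with respect to the parameters b, r is the residual vector (both defined in the next section), delta is the parameter update, and lambda is the damping factor; everything beyond the symbols in the gist is part of this assumed standard form.

```latex
% One LM iteration solves the damped normal equations for the step \delta:
%   large \lambda  -> behaves like (scaled) gradient descent
%   small \lambda  -> behaves like Gauss-Newton
\left( J^\top J + \lambda \,\operatorname{diag}(J^\top J) \right) \delta = J^\top r
```

Levenberg's original form uses lambda * I in place of lambda * diag(J^T J); Marquardt's diagonal scaling is a common variant. When a step reduces the cost S, lambda is decreased (more Gauss-Newton-like); when it fails, lambda is increased (more gradient-descent-like).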

Mathematical representation

  • set of m data points: (x_1, y_1), (x_2, y_2), ..., (x_m, y_m)
  • curve function: y = f(x, b)
  • n parameters: b = (b_1, b_2, ..., b_n)
  • residuals (errors) r_i, for i from 1 to m, given by r_i = y_i - f(x_i, b)
  • find b such that the sum of the squared residuals is minimized: S(b) = \sum_{i=1}^{m} r_i^2 (see the worked sketch below)
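
To tie the notation together, here is a from-scratch sketch of the LM iteration in NumPy for the same toy model used above, y = b1 * exp(b2 * x). The model, data, starting guess, and the accept/reject schedule for lambda are all illustrative assumptions, not taken from the gist; a production code would use a library implementation instead.

```python
import numpy as np

def model(x, b):
    return b[0] * np.exp(b[1] * x)

def residuals(b, x, y):
    # r_i = y_i - f(x_i, b)
    return y - model(x, b)

def jacobian(b, x):
    # Jacobian of the model f with respect to b (same convention as the
    # update equation above): J[i, j] = d f(x_i, b) / d b_j
    J = np.empty((x.size, 2))
    J[:, 0] = np.exp(b[1] * x)
    J[:, 1] = b[0] * x * np.exp(b[1] * x)
    return J

def levenberg_marquardt(x, y, b0, lam=1e-3, iters=50):
    b = np.asarray(b0, dtype=float)
    for _ in range(iters):
        r = residuals(b, x, y)
        J = jacobian(b, x)
        A = J.T @ J
        # Damped normal equations: (J^T J + lam * diag(J^T J)) delta = J^T r
        delta = np.linalg.solve(A + lam * np.diag(np.diag(A)), J.T @ r)
        if np.sum(residuals(b + delta, x, y) ** 2) < np.sum(r ** 2):
            b, lam = b + delta, lam / 10   # good step: lean toward Gauss-Newton
        else:
            lam *= 10                      # bad step: lean toward gradient descent
    return b

# Noisy synthetic data; recover (b1, b2) ~ (2.5, 1.3) from a rough guess.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.5 * np.exp(1.3 * x) + 0.05 * rng.standard_normal(x.size)
print(levenberg_marquardt(x, y, b0=[1.0, 1.0]))
```

The lambda schedule here (divide or multiply by 10) is the simplest common heuristic; real implementations use more careful step-control strategies.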
