- an algorithm to solve non-linear least squares minimization problems
- finds a local minimum, not necessarily the global minimum
- alternatives: Gauss–Newton, gradient-descent
- LM algorithm combines the advantages of gradient-descent and Gauss-Newton
- tolerates initial guesses farther from the solution than Gauss-Newton does
- behaves like gradient-descent when far from the minimum, then shifts toward Gauss-Newton as it nears the solution
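The blend of the two methods can be written as a single damped update equation (a standard formulation of LM, using the residual vector r and the Jacobian J of the model with respect to the parameters; the damping factor λ is not named in these notes):

```latex
% LM step \delta for the parameters b, with damping factor \lambda:
(J^\top J + \lambda I)\,\delta = J^\top r
% large \lambda: \delta \approx \tfrac{1}{\lambda} J^\top r  (a small gradient-descent step)
% small \lambda: \delta solves J^\top J\,\delta = J^\top r   (the Gauss-Newton step)
```

Raising λ after a rejected step and lowering it after an accepted one is what interpolates between the two regimes.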
- set of m data points, (x_1, y_1), (x_2, y_2), ..., (x_m, y_m)
- curve function y = f(x, b)
- n parameters, b = (b_1, b_2, ..., b_n)
- residuals (errors) r_i, for i from 1 to m, are given by r_i = y_i - f(x_i, b)
- find b such that the sum of the squared residuals is minimized, S(b) = sum_{i=1..m} r_i^2
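The setup above can be sketched as a minimal LM loop in Python with NumPy. This is an illustrative implementation under simple assumptions (multiplicative update of the damping factor, an analytic Jacobian supplied by the caller); the function and parameter names, and the exponential test model, are invented for the example:

```python
import numpy as np

def levenberg_marquardt(f, jac, b0, x, y, lam=1e-3, max_iter=100, tol=1e-8):
    """Minimize S(b) = sum_i r_i^2, with residuals r_i = y_i - f(x_i, b)."""
    b = np.asarray(b0, dtype=float)
    r = y - f(x, b)
    S = r @ r
    for _ in range(max_iter):
        J = jac(x, b)                  # m x n Jacobian of f w.r.t. b
        A = J.T @ J                    # Gauss-Newton approximation of the Hessian
        g = J.T @ r                    # descent direction for S
        # damped normal equations: (J^T J + lam*I) delta = J^T r
        delta = np.linalg.solve(A + lam * np.eye(len(b)), g)
        b_new = b + delta
        r_new = y - f(x, b_new)
        S_new = r_new @ r_new
        if S_new < S:                  # step accepted: lean toward Gauss-Newton
            lam *= 0.1
            b, r, S = b_new, r_new, S_new
            if np.linalg.norm(delta) < tol:
                break
        else:                          # step rejected: lean toward gradient descent
            lam *= 10.0
    return b

# Example: fit y = b1 * exp(b2 * x) to noisy synthetic data
model = lambda x, b: b[0] * np.exp(b[1] * x)
model_jac = lambda x, b: np.column_stack([np.exp(b[1] * x),
                                          b[0] * x * np.exp(b[1] * x)])
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = model(x, [2.0, 1.5]) + rng.normal(0.0, 0.01, x.size)
b_fit = levenberg_marquardt(model, model_jac, [1.0, 1.0], x, y)
```

In practice, library routines such as SciPy's `scipy.optimize.least_squares` (with `method="lm"`) implement a more robust version of the same idea.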