Nonlinear least squares regression of multiple weighted simultaneous equations using a Gauss-Newton, Levenberg, or Levenberg-Marquardt algorithm

by Trent Guidry, 4 December 2011 08:31

Gauss-Newton, Levenberg, and Levenberg-Marquardt algorithms can be used for the nonlinear least squares regression of parameters for systems of simultaneous equations.


These nonlinear least squares regression algorithms work as follows:

1.  Calculate the weight matrix W. The weight matrix is a diagonal matrix of the weights and has the same number of rows as the error matrix; each row in the weight matrix corresponds to the same row in the error matrix.

2.  Start with initial parameter guess values P0. If using the Levenberg or Levenberg-Marquardt algorithm, also start with an initial value of the damping factor lambda.

3.  Calculate the Jacobian matrix J. This involves computing the partial first derivatives of the regression functions with respect to the parameters, evaluated at the data points. This can be done using either analytically derived derivatives or numerically calculated derivatives.

4.  Calculate the error matrix E, the differences between the measured values and the values predicted by the regression functions at the current parameter estimates.

5.  Using the update formula of the algorithm being used, calculate P1. If using the Levenberg or Levenberg-Marquardt algorithm, calculate the error at P1. If the new error is worse than the starting error, increase lambda and repeat until the error decreases; if the new error is better than the starting error, decrease lambda. P1 is the new estimate of the parameters.

6.  Set P0 to P1 and repeat the above process until the algorithm either converges or diverges.
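The steps above can be sketched in Python with numpy. This is a minimal illustration, not a definitive implementation: the exponential model, the forward-difference step size, the lambda scaling factors, and the tolerances are all assumptions made for the example.

```python
import numpy as np

def model(p, x):
    # Hypothetical regression function used for illustration: y = p0 * exp(p1 * x)
    return p[0] * np.exp(p[1] * x)

def jacobian(p, x, h=1e-7):
    # Step 3: numerically calculated partial first derivatives of the model
    # with respect to each parameter (forward differences, assumed step size h).
    J = np.empty((x.size, p.size))
    f0 = model(p, x)
    for j in range(p.size):
        dp = p.copy()
        dp[j] += h
        J[:, j] = (model(dp, x) - f0) / h
    return J

def levenberg_marquardt(p0, x, y, w, lam=1e-3, tol=1e-10, max_iter=100):
    W = np.diag(w)                      # Step 1: diagonal weight matrix
    p = p0.astype(float).copy()         # Step 2: initial guesses and initial lambda
    for _ in range(max_iter):
        J = jacobian(p, x)              # Step 3: Jacobian at the current estimate
        E = y - model(p, x)             # Step 4: error matrix (residuals)
        sse = E @ W @ E                 # weighted sum of squared errors
        JtWJ = J.T @ W @ J
        # Step 5: Levenberg-Marquardt update; raise lambda until the error drops
        while True:
            A = JtWJ + lam * np.diag(np.diag(JtWJ))
            step = np.linalg.solve(A, J.T @ W @ E)
            p1 = p + step
            E1 = y - model(p1, x)
            if E1 @ W @ E1 < sse:
                lam = max(lam / 10.0, 1e-12)   # error improved: decrease lambda
                break
            lam *= 10.0                        # error worsened: increase lambda
            if lam > 1e12:
                return p                       # stalled; return current estimate
        if np.abs(step).max() < tol:    # Step 6: repeat until converged
            return p1
        p = p1                          # set P0 to P1 and iterate
    return p
```

With exact (noise-free) data generated from the model, the iteration recovers the generating parameters from a rough starting guess.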

The update formulas for the three algorithms are:

Gauss-Newton:  P1 = P0 + (JᵀWJ)⁻¹ JᵀWE

Levenberg:  P1 = P0 + (JᵀWJ + λI)⁻¹ JᵀWE

Levenberg-Marquardt:  P1 = P0 + (JᵀWJ + λ·diag(JᵀWJ))⁻¹ JᵀWE

where the Jacobian J has entries J[i][j] = ∂f(xᵢ, P)/∂Pⱼ and the error matrix E has entries E[i] = yᵢ − f(xᵢ, P).
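The Gauss-Newton, Levenberg, and Levenberg-Marquardt update rules differ only in the matrix that is inverted. A hedged numpy sketch makes the comparison concrete; the Jacobian J, weight matrix W, error matrix E, current parameters p0, and damping factor lam below are illustrative stand-in values, not data from the text.

```python
import numpy as np

# Illustrative stand-in values for J, W, E, P0, and lambda.
rng = np.random.default_rng(0)
J = rng.normal(size=(10, 2))        # Jacobian: 10 data points, 2 parameters
W = np.diag(np.full(10, 1.0))       # diagonal weight matrix
E = rng.normal(size=10)             # error matrix (residuals)
p0 = np.zeros(2)                    # current parameter estimate
lam = 0.01                          # damping factor lambda

JtWJ = J.T @ W @ J
JtWE = J.T @ W @ E

# Gauss-Newton: no damping term.
p1_gn = p0 + np.linalg.solve(JtWJ, JtWE)
# Levenberg: damps with lambda times the identity matrix.
p1_lev = p0 + np.linalg.solve(JtWJ + lam * np.eye(2), JtWE)
# Levenberg-Marquardt: damps with lambda times the diagonal of JtWJ.
p1_lm = p0 + np.linalg.solve(JtWJ + lam * np.diag(np.diag(JtWJ)), JtWE)
```

As lambda approaches zero, both damped updates reduce to the Gauss-Newton step, which is why decreasing lambda near the solution recovers Gauss-Newton's fast convergence.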