The Gauss-Newton, Levenberg, and Levenberg-Marquardt algorithms can be used for nonlinear least squares regression of the parameters of systems of simultaneous equations.

These nonlinear least squares regression algorithms work as follows:

1. Calculate the weight matrix W. The weight matrix is a diagonal matrix whose diagonal entries are the weights. It has the same number of rows as the error matrix, and each row of W corresponds to the same row of the error matrix.

2. Start with initial parameter guess values P_{0}. If using the Levenberg or Levenberg-Marquardt algorithm, also choose an initial value of the damping parameter lambda.

3. Calculate the Jacobian matrix J by computing the partial first derivatives of the regression functions with respect to the parameters, evaluated at the data points. The derivatives can be obtained either analytically or numerically.

4. Calculate the error matrix E.

5. Using the update formula of the chosen algorithm, calculate P_{1}, the new estimate of the parameters. If using the Levenberg or Levenberg-Marquardt algorithm, calculate the error at P_{1}: if the new error is worse than the starting error, increase lambda and recompute P_{1}, repeating until the error decreases; if the new error is better than the starting error, decrease lambda.

6. Set P_{0} to P_{1} and repeat the above process until the algorithm either converges or diverges.
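The steps above can be sketched in Python. This is a minimal illustration, not a production solver: numpy is assumed, the Jacobian is approximated by forward differences, the residual convention is E = y - f(x, P), and the Marquardt scaling lambda * diag(J^T W J) is used in step 5.

```python
import numpy as np

def numerical_jacobian(f, x, p, eps=1e-7):
    """Step 3: forward-difference Jacobian of the model w.r.t. the parameters."""
    f0 = f(x, p)
    J = np.zeros((len(f0), len(p)))
    for j in range(len(p)):
        dp = np.zeros_like(p)
        dp[j] = eps
        J[:, j] = (f(x, p + dp) - f0) / eps
    return J

def levenberg_marquardt(f, x, y, p0, w=None, lam=1e-3, tol=1e-10, max_iter=100):
    """Minimal Levenberg-Marquardt loop following steps 1-6 above."""
    p = np.asarray(p0, dtype=float)
    # Step 1: diagonal weight matrix, one row per row of the error matrix.
    W = np.diag(w if w is not None else np.ones(len(y)))
    for _ in range(max_iter):
        E = y - f(x, p)                       # step 4: error matrix
        J = numerical_jacobian(f, x, p)       # step 3: Jacobian
        sse = E @ W @ E                       # starting (weighted) error
        A = J.T @ W @ J
        # Step 5: adjust lambda until the error decreases.
        while True:
            step = np.linalg.solve(A + lam * np.diag(np.diag(A)), J.T @ W @ E)
            p_new = p + step
            E_new = y - f(x, p_new)
            if E_new @ W @ E_new < sse:
                lam = max(lam / 10, 1e-12)    # improvement: relax damping
                break
            lam *= 10                          # worse: increase damping, retry
            if lam > 1e12:
                return p                       # cannot improve further
        if np.linalg.norm(step) < tol:         # step 6: converged
            return p_new
        p = p_new                              # step 6: P_0 <- P_1, repeat
    return p
```

As a usage example, fitting the hypothetical model y = a * exp(b * x) to noiseless data generated with a = 2, b = -1 from a starting guess of [1.0, -0.5] recovers the true parameters.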

Gauss-Newton
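The standard Gauss-Newton update formula for step 5, assuming the residual convention E_{i} = y_{i} - f(x_{i}, P_{0}) and J_{ij} = partial f(x_{i}, P) / partial p_{j}, is:

    P_{1} = P_{0} + (J^{T} W J)^{-1} J^{T} W E

There is no damping parameter lambda; the full Newton-like step is always taken, which converges quickly near the solution but can diverge from a poor starting guess.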

Levenberg
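The standard Levenberg update formula for step 5 adds a damping term lambda * I to the Gauss-Newton normal equations (same residual and Jacobian conventions as above):

    P_{1} = P_{0} + (J^{T} W J + lambda * I)^{-1} J^{T} W E

For small lambda this behaves like Gauss-Newton; for large lambda the step shrinks toward a small gradient-descent step, which is what makes the increase-lambda-until-the-error-decreases strategy of step 5 work.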

Levenberg-Marquardt
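The standard Levenberg-Marquardt update formula for step 5 replaces the identity in the Levenberg damping term with the diagonal of J^{T} W J, so each parameter is damped in proportion to the curvature along its own axis:

    P_{1} = P_{0} + (J^{T} W J + lambda * diag(J^{T} W J))^{-1} J^{T} W E

This scaling makes the damped step less sensitive to the relative scaling of the parameters than plain Levenberg damping.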

Jacobian
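With m data points and n parameters, the Jacobian used in step 3 is the m-by-n matrix of partial first derivatives of the model function f, evaluated at each data point:

    J_{ij} = partial f(x_{i}, P) / partial p_{j}

When analytical derivatives are unavailable, a forward-difference approximation can be used, for a small step h:

    J_{ij} approx ( f(x_{i}, P + h e_{j}) - f(x_{i}, P) ) / h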

Error
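Assuming observed values y_{i} and model function f, the error matrix of step 4 is the column vector of residuals at the current parameter estimate:

    E_{i} = y_{i} - f(x_{i}, P_{0})

The quantity the algorithms minimize, and the one compared in step 5 when adjusting lambda, is the weighted sum of squared errors E^{T} W E.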