
Algorithms without Derivatives

The algorithms described in this section do not require any derivative information to be supplied by the user. Any derivatives needed are approximated by finite differences.

Solver: gsl_multiroot_fsolver_hybrids
This is a version of the Hybrid algorithm which replaces calls to the Jacobian function by its finite difference approximation. The finite difference approximation is computed using gsl_multiroots_fdjac with a relative step size of GSL_SQRT_DBL_EPSILON.
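
For reference, here is a minimal sketch of how a derivative-free solver such as gsl_multiroot_fsolver_hybrids is typically driven through the gsl_multiroot_fsolver interface. The example system, starting point and tolerance are illustrative choices, not part of the library.

/* Minimal sketch: driving a derivative-free solver (here hybrids).
   The example system, starting point and tolerance are illustrative. */
#include <stdio.h>
#include <gsl/gsl_errno.h>
#include <gsl/gsl_vector.h>
#include <gsl/gsl_multiroots.h>

/* Example system: f0 = 1 - x0, f1 = 10 (x1 - x0^2), with root at (1,1). */
static int
example_f (const gsl_vector *x, void *params, gsl_vector *f)
{
  (void) params;
  double x0 = gsl_vector_get (x, 0);
  double x1 = gsl_vector_get (x, 1);
  gsl_vector_set (f, 0, 1.0 - x0);
  gsl_vector_set (f, 1, 10.0 * (x1 - x0 * x0));
  return GSL_SUCCESS;
}

int
main (void)
{
  const size_t n = 2;
  gsl_multiroot_function f = { &example_f, n, NULL };

  gsl_vector *x = gsl_vector_alloc (n);
  gsl_vector_set (x, 0, -10.0);
  gsl_vector_set (x, 1, -5.0);

  /* Only function values are supplied; the solver approximates the
     Jacobian internally by finite differences. */
  gsl_multiroot_fsolver *s =
    gsl_multiroot_fsolver_alloc (gsl_multiroot_fsolver_hybrids, n);
  gsl_multiroot_fsolver_set (s, &f, x);

  int status;
  size_t iter = 0;
  do
    {
      iter++;
      status = gsl_multiroot_fsolver_iterate (s);
      if (status)      /* solver is stuck or an error occurred */
        break;
      status = gsl_multiroot_test_residual (s->f, 1e-7);
    }
  while (status == GSL_CONTINUE && iter < 1000);

  printf ("status = %s after %zu iterations\n", gsl_strerror (status), iter);

  gsl_multiroot_fsolver_free (s);
  gsl_vector_free (x);
  return 0;
}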

Solver: gsl_multiroot_fsolver_hybrid
This is a finite difference version of the Hybrid algorithm without internal scaling.

Solver: gsl_multiroot_fsolver_dnewton

The discrete Newton algorithm is the simplest method of solving a multidimensional system. It uses the Newton iteration

@math{x \to x - J^{-1} f(x)}

where the Jacobian matrix @math{J} is approximated by taking finite differences of the function @math{f}. The approximation scheme used by this implementation is,

@math{J_{ij} = (f_i(x + \delta_j) - f_i(x)) / \delta_j}

where @math{\delta_j} is a step of size @math{\sqrt\epsilon |x_j|} with @math{\epsilon} being the machine precision (@math{\epsilon \approx 2.22 \times 10^{-16}}). The order of convergence of Newton's algorithm is quadratic, but the finite differences require @math{n^2} function evaluations on each iteration. The algorithm may become unstable if the finite differences are not a good approximation to the true derivatives.
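
The following stand-alone sketch illustrates this forward-difference scheme with plain C arrays. It is not the library's internal gsl_multiroots_fdjac routine; the function names, the system_fn type and the handling of the @math{x_j = 0} case are illustrative assumptions.

/* Illustrative sketch (not the library's internal gsl_multiroots_fdjac):
   forward-difference approximation of the Jacobian,
       J_ij = (f_i(x + delta_j) - f_i(x)) / delta_j,
   with delta_j = sqrt(eps) * |x_j|. */
#include <stddef.h>
#include <stdlib.h>
#include <math.h>
#include <float.h>

/* Hypothetical user-supplied system: writes f(x) into fx; n is the dimension. */
typedef void (*system_fn) (const double *x, double *fx, size_t n);

static void
approx_jacobian (system_fn f, double *x, size_t n,
                 const double *fx,   /* f(x) at the current point */
                 double *J)          /* n x n Jacobian, row-major */
{
  double *fx_step = malloc (n * sizeof (double));
  size_t i, j;

  for (j = 0; j < n; j++)
    {
      double xj = x[j];
      double delta = sqrt (DBL_EPSILON) * fabs (xj);
      if (delta == 0.0)
        delta = sqrt (DBL_EPSILON);   /* guard against x_j = 0 (assumption) */

      x[j] = xj + delta;              /* perturb the j-th coordinate */
      f (x, fx_step, n);
      x[j] = xj;                      /* restore */

      for (i = 0; i < n; i++)
        J[i * n + j] = (fx_step[i] - fx[i]) / delta;
    }

  free (fx_step);
}

Each column of the Jacobian costs one extra function evaluation, which is where the @math{n^2} per-iteration cost quoted above comes from (each of the @math{n} columns requires @math{n} component evaluations).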

Solver: gsl_multiroot_fsolver_broyden

The Broyden algorithm is a version of the discrete Newton algorithm which attempts to avoid the expensive recomputation of the Jacobian matrix on each iteration. The changes to the Jacobian are also approximated, using a rank-1 update,

@math{J^{-1} \to J^{-1} - (J^{-1} df - dx) dx^T J^{-1} / (dx^T J^{-1} df)}

where the vectors @math{dx} and @math{df} are the changes in @math{x} and @math{f}. On the first iteration the inverse Jacobian is estimated using finite differences, as in the discrete Newton algorithm. This approximation gives a fast update but is unreliable if the changes are not small, and the estimate of the inverse Jacobian degrades as the iteration proceeds. The algorithm has a tendency to become unstable unless it starts close to the root. The Jacobian is refreshed if this instability is detected (consult the source for details).
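
As an illustration of the rank-1 update formula above, here is a stand-alone sketch applying the correction to an explicitly stored inverse Jacobian. The function name and row-major array layout are hypothetical; this is not the library's internal code.

/* Illustrative sketch of the Broyden rank-1 update of the inverse Jacobian,
       Jinv <- Jinv - (Jinv df - dx) dx^T Jinv / (dx^T Jinv df),
   using plain row-major arrays (not the library's internal code). */
#include <stddef.h>
#include <stdlib.h>

static void
broyden_update (double *Jinv, const double *dx, const double *df, size_t n)
{
  double *w = malloc (n * sizeof (double));   /* w = Jinv df     */
  double *v = malloc (n * sizeof (double));   /* v^T = dx^T Jinv */
  double denom = 0.0;
  size_t i, j;

  for (i = 0; i < n; i++)
    {
      w[i] = 0.0;
      for (j = 0; j < n; j++)
        w[i] += Jinv[i * n + j] * df[j];
    }
  for (j = 0; j < n; j++)
    {
      v[j] = 0.0;
      for (i = 0; i < n; i++)
        v[j] += dx[i] * Jinv[i * n + j];
    }
  for (i = 0; i < n; i++)
    denom += dx[i] * w[i];                    /* dx^T Jinv df */

  /* Subtract the rank-1 correction (w - dx) v^T / denom. */
  for (i = 0; i < n; i++)
    for (j = 0; j < n; j++)
      Jinv[i * n + j] -= (w[i] - dx[i]) * v[j] / denom;

  free (w);
  free (v);
}

The update costs only @math{O(n^2)} arithmetic and no additional function evaluations, which is the saving over recomputing the finite-difference Jacobian from scratch.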

This algorithm is not recommended and is included only for demonstration purposes.

