BFGS Method

In numerical optimization, the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method is an iterative method for solving unconstrained nonlinear optimization problems.

The BFGS method approximates Newton's method, a class of hill-climbing optimization techniques that seeks a stationary point of a (preferably twice continuously differentiable) function: for such problems, a necessary condition for optimality is that the gradient be zero. Newton's method and the BFGS method are not guaranteed to converge unless the function has a quadratic Taylor expansion near an optimum. These methods use first- and second-derivative information. Even so, BFGS has proven to perform well even on non-smooth optimization problems.
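To make the stationary-point condition concrete, here is a minimal sketch (in Python, with illustrative names) of the Newton iteration that BFGS approximates: each step solves the Hessian system and moves toward a point where the gradient vanishes.

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Basic Newton iteration for minimization (illustrative sketch).

    Each step solves H(x) * d = -g(x) and updates x, stopping once the
    necessary optimality condition (gradient ~ 0) is met.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # gradient (approximately) zero
            break
        x = x - np.linalg.solve(hess(x), g)
    return x

# Example: f(x, y) = (x - 1)^2 + 10*(y + 2)^2, whose minimum is (1, -2).
grad = lambda v: np.array([2 * (v[0] - 1), 20 * (v[1] + 2)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 20.0]])
x_star = newton_minimize(grad, hess, np.zeros(2))
```

For this quadratic example a single Newton step already lands on the exact minimizer, which is the fast local behavior that quasi-Newton methods try to imitate without forming the Hessian.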

In quasi-Newton methods, the Hessian matrix of second derivatives does not need to be evaluated directly. Instead, the Hessian matrix is approximated using low-rank updates specified by gradient evaluations (or approximate gradient evaluations). Quasi-Newton methods are a generalization of the secant method to finding the root of the first derivative in multidimensional problems. In multiple dimensions the secant equation does not specify a unique solution, and quasi-Newton methods differ in how they constrain the solution. The BFGS method is one of the most popular members of this class. Also in common use is L-BFGS, a limited-memory version of BFGS that is particularly suited to problems with very large numbers of variables (e.g., more than a thousand). The L-BFGS-B variant handles simple box constraints.
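The update described above can be sketched in code. The following is a hedged, self-contained illustration (not a production implementation) of the standard BFGS inverse-Hessian update: with s = x_{k+1} - x_k, y = g_{k+1} - g_k, and rho = 1/(y^T s), the approximation is updated as H <- (I - rho s y^T) H (I - rho y s^T) + rho s s^T, which satisfies the secant equation H y = s. The line search and tolerances here are illustrative choices.

```python
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=200):
    """Sketch of BFGS with a backtracking (Armijo) line search.

    Maintains an approximation H to the inverse Hessian, updated from
    gradient differences only; no second derivatives are evaluated.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        t = 1.0                        # backtracking line search (Armijo)
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                 # skip update unless curvature condition holds
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Classic test problem: the Rosenbrock function, with minimum at (1, 1).
f = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
grad = lambda v: np.array([
    -2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0] ** 2),
    200 * (v[1] - v[0] ** 2),
])
x_star = bfgs_minimize(f, grad, np.array([-1.2, 1.0]))
```

Note that the loop touches only gradient values: the curvature information enters solely through the pairs (s, y), which is what distinguishes quasi-Newton methods from Newton's method. L-BFGS follows the same recurrence but stores only a few recent (s, y) pairs instead of the dense matrix H.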

