In mathematics, the conjugate gradient method is an algorithm for the numerical solution of particular systems of linear equations, namely those whose matrix is symmetric and positive-definite. The conjugate gradient method is an iterative method, so it can be applied to sparse systems that are too large to be handled by direct methods such as the Cholesky decomposition. Such systems often arise when numerically solving partial differential equations.
The conjugate gradient method can also be used to solve unconstrained optimization problems such as energy minimization. It was developed by Magnus Hestenes and Eduard Stiefel.
The biconjugate gradient method provides a generalization to non-symmetric matrices. Various nonlinear conjugate gradient methods seek minima of nonlinear functions.
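As an illustration of the iterative method described above, here is a minimal sketch of the conjugate gradient iteration in Python with NumPy; the function name and tolerance are illustrative choices, not part of any standard library:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Solve A x = b for a symmetric positive-definite matrix A."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:      # converged
            break
        p = r + (rs_new / rs_old) * p  # next direction, conjugate to previous ones
        rs_old = rs_new
    return x

# Example: a small symmetric positive-definite system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic the loop terminates in at most n iterations (here n = 2), which is the sense in which the method can also be viewed as a direct method; in floating point it is used iteratively with a residual-based stopping test, as in the sketch.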