The Conjugate Gradient Method Vs. The Locally Optimal Steepest Descent Method
In both the original and the preconditioned conjugate gradient methods, one only needs to set β_k := 0 at every step in order to turn them into locally optimal steepest descent methods using the same line search. With this substitution, the search directions p_k are always the same as the residuals r_k (or the preconditioned residuals z_k), so there is no need to store the vectors p_k. Thus, every iteration of these steepest descent methods is a bit cheaper than an iteration of the conjugate gradient methods. However, the latter converge faster, unless a (highly) variable preconditioner is used, see above.
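To make the relationship concrete, here is a minimal sketch, not from the original article, assuming a symmetric positive-definite matrix A; the function name solve and its parameters are illustrative. It implements the standard conjugate gradient iteration and shows how forcing β_k = 0 at every step collapses it into the locally optimal steepest descent method:

    import numpy as np

    def solve(A, b, x0, steepest_descent=False, tol=1e-10, max_iter=1000):
        # Illustrative sketch: A is assumed symmetric positive definite.
        x = x0.copy()
        r = b - A @ x          # residual
        p = r.copy()           # search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)   # exact line search along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            # Setting beta = 0 turns CG into locally optimal steepest
            # descent: the new direction is just the residual, so p need
            # not be stored separately.
            beta = 0.0 if steepest_descent else rs_new / rs_old
            p = r + beta * p
            rs_old = rs_new
        return x

With steepest_descent=True, the update p = r + beta * p reduces to p = r, which is where the per-iteration saving comes from, at the cost of the slower convergence noted above.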