The Conjugate Gradient Method Vs. The Locally Optimal Steepest Descent Method
In both the original and the preconditioned conjugate gradient methods, one only needs to set β_k := 0 at every iteration in order to turn them into locally optimal steepest descent methods using the same line search. With this substitution, the search directions p_k are always the same as the (preconditioned) residual vectors, so there is no need to store the vectors p_k separately. Thus, every iteration of these steepest descent methods is slightly cheaper than an iteration of the conjugate gradient methods. However, the latter converge faster, unless a (highly) variable preconditioner is used, see above.
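The relationship can be sketched in a few lines of NumPy. The function below is a minimal, unpreconditioned illustration (not production code): with beta_zero=False it runs standard conjugate gradient with the Fletcher–Reeves-style β_k = (r_{k+1}·r_{k+1})/(r_k·r_k); with beta_zero=True it sets β_k := 0, which collapses the search direction to the residual and yields locally optimal steepest descent. The function name and signature are illustrative choices, not an established API.

```python
import numpy as np

def solve_spd(A, b, beta_zero=False, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive definite A.

    beta_zero=False : conjugate gradient method.
    beta_zero=True  : beta_k := 0, i.e. locally optimal steepest
                      descent with exact line search.
    Returns (x, number_of_iterations).
    """
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    iters = 0
    while iters < max_iter and np.sqrt(r @ r) >= tol:
        rr = r @ r
        Ap = A @ p
        alpha = rr / (p @ Ap)      # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        if beta_zero:
            # Steepest descent: the new direction is just the residual,
            # so p need not be stored as a separate vector.
            p = r.copy()
        else:
            beta = (r @ r) / rr    # beta_k for conjugate gradient
            p = r + beta * p
        iters += 1
    return x, iters
```

On an ill-conditioned system, both variants reach the same solution, but the steepest descent run typically needs many more (individually slightly cheaper) iterations, in line with the convergence comparison above.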