Gradient Descent

Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. To find a local minimum using gradient descent, one takes steps proportional to the negative of the gradient (or of an approximate gradient) of the function at the current point. If instead one takes steps proportional to the positive of the gradient, one approaches a local maximum of the function; that procedure is known as gradient ascent.
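
As a concrete illustration, here is a minimal sketch of the update rule in Python. The quadratic objective, the step size gamma, the tolerance, and the iteration cap are all arbitrary choices made for this example, not part of any standard method.

    # Minimal gradient descent sketch: the step size gamma, tolerance,
    # and iteration cap are illustrative choices, not canonical values.

    def gradient_descent(grad, x0, gamma=0.1, tol=1e-8, max_iter=1000):
        """Repeatedly step against the gradient until the update is negligible."""
        x = x0
        for _ in range(max_iter):
            step = gamma * grad(x)
            x -= step            # flip the sign (x += step) for gradient ascent
            if abs(step) < tol:  # stop once steps become negligibly small
                break
        return x

    # Example: f(x) = (x - 3)**2 has gradient 2*(x - 3) and a minimum at x = 3.
    minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
    print(minimum)  # approximately 3.0

Each iteration moves the current point a distance proportional to the local gradient, so steps shrink automatically as the point nears a stationary point where the gradient vanishes.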

Gradient descent is also known as steepest descent, or the method of steepest descent. Under the latter name, it should not be confused with the method of steepest descent for approximating integrals.
