Stochastic Gradient Descent

Stochastic gradient descent (SGD) is an iterative optimization method for minimizing an objective function that is written as a sum of differentiable functions. Instead of computing the full gradient over the entire sum at each step, it takes a step using the gradient of a single randomly chosen summand (or a small batch of them), which makes each update cheap at the cost of added noise.
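As a minimal sketch of the idea, the following fits a line y = a*x + b by least squares, updating the parameters from one sample's gradient at a time. The problem, data, and function names (`sgd`, `grad_i`) are illustrative assumptions, not taken from the source.

```python
import random

def sgd(grad_i, w, n, lr=0.01, epochs=100, seed=0):
    """Minimize f(w) = (1/n) * sum_i f_i(w) using per-sample gradients."""
    rng = random.Random(seed)
    for _ in range(epochs):
        for i in rng.sample(range(n), n):  # visit samples in shuffled order
            g = grad_i(w, i)               # gradient of the i-th summand only
            w = [wj - lr * gj for wj, gj in zip(w, g)]
    return w

# Toy data on the exact line y = 2x + 1 (an assumed example problem)
data = [(x, 2.0 * x + 1.0) for x in [0.0, 0.5, 1.0, 1.5, 2.0]]

def grad_i(w, i):
    a, b = w
    x, y = data[i]
    err = (a * x + b) - y          # residual for sample i
    return [2 * err * x, 2 * err]  # gradient of (a*x + b - y)^2 w.r.t. (a, b)

a, b = sgd(grad_i, [0.0, 0.0], n=len(data), lr=0.05, epochs=500)
```

With a full-batch method each step would average the gradients of all five samples; here each update uses just one, so the iterates wander stochastically toward the least-squares solution.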
