Stochastic gradient descent is a gradient descent optimization method for minimizing an objective function that is written as a sum of differentiable functions; at each step it updates the parameters using the gradient of a single summand (or a small batch of summands) rather than the gradient of the full sum.
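As a minimal sketch of the idea, the following Python example applies stochastic gradient descent to a least-squares objective written as a sum of per-example losses. The data, variable names, and hyperparameters (learning rate, number of epochs) are illustrative choices, not taken from the source.

```python
import numpy as np

# Objective: Q(w) = (1/n) * sum_i (x_i . w - y_i)^2,
# i.e. a sum of differentiable per-example terms.
rng = np.random.default_rng(0)

# Synthetic data: y = X @ w_true + noise (illustrative only)
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=n)

w = np.zeros(d)      # initial parameters
lr = 0.01            # learning rate (step size)
n_epochs = 50

for epoch in range(n_epochs):
    for i in rng.permutation(n):      # visit examples in random order
        # Gradient of the single-example loss (x_i . w - y_i)^2 w.r.t. w
        grad_i = 2.0 * (X[i] @ w - y[i]) * X[i]
        w -= lr * grad_i              # step against that one gradient

print("estimated w:", w)  # should be close to w_true
```

Because each update uses only one summand's gradient, the per-step cost is independent of the number of terms in the sum, which is what makes the method attractive for large datasets.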