In machine learning, early stopping is a form of regularization used when a machine learning model (such as a neural network) is trained by on-line gradient descent. In early stopping, the training set is split into a new training set and a validation set. Gradient descent is applied to the new training set. After each sweep through the new training set, the network is evaluated on the validation set. When performance on the validation set stops improving, the algorithm halts. The network with the best performance on the validation set is then used for actual testing on a separate test set (the validation set is used only during learning, to decide when to stop).
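The procedure above can be sketched as follows. This is a minimal toy illustration, not a reference implementation: the model (a one-parameter linear fit), the learning rate, and the patience threshold of five epochs are all assumptions chosen for the example.

```python
import random

# Toy data: y = 2x plus noise, split into a new training set and a validation set.
random.seed(0)
data = [(i / 100, 2.0 * (i / 100) + random.gauss(0, 0.1)) for i in range(200)]
random.shuffle(data)
train, valid = data[:160], data[160:]

def mse(w, pairs):
    """Mean squared error of the model y = w*x on a data set."""
    return sum((w * x - y) ** 2 for x, y in pairs) / len(pairs)

w, lr = 0.0, 0.1
best_w, best_loss = w, float("inf")
patience, bad_epochs = 5, 0

for epoch in range(1000):
    for x, y in train:                 # one sweep through the new training set,
        w -= lr * 2 * (w * x - y) * x  # updating after each example (on-line GD)
    val_loss = mse(w, valid)           # evaluate on the validation set
    if val_loss < best_loss:
        best_loss, best_w = val_loss, w
        bad_epochs = 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:     # validation performance stopped improving
            break

w = best_w  # keep the weights with the best validation performance
```

In practice "stops improving" is usually softened to "has not improved for `patience` epochs", as above, so that ordinary epoch-to-epoch noise in the validation loss does not halt training prematurely.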
This technique is a simple but effective way to deal with the problem of overfitting. Overfitting is a phenomenon in which a learning system, such as a neural network, becomes very good at fitting one data set at the expense of performing poorly on other data sets. By halting training early, early stopping limits how far the network's weights can move from their initial values, which imposes a form of regularization and effectively lowers the VC dimension.
Early stopping is a very common practice in neural network training and often produces networks that generalize well. However, while it often improves generalization, it does not do so in a mathematically well-defined way.