VC Dimension - Uses


The VC dimension is useful in statistical learning theory because it yields a probabilistic upper bound on the test error of a classification model.

The bound on the test error of a classification model (on data that is drawn i.i.d. from the same distribution as the training set) is given by

$$\text{test error} \leq \text{training error} + \sqrt{\frac{1}{N}\left[D\left(\log\frac{2N}{D} + 1\right) - \log\frac{\eta}{4}\right]}$$

with probability $1 - \eta$, where $D$ is the VC dimension of the classification model, $0 < \eta \leq 1$, and $N$ is the size of the training set (restriction: this formula is valid when $D \ll N$). Similar complexity bounds can be derived using Rademacher complexity, which can sometimes provide more insight than VC dimension calculations into statistical methods such as those using kernels.
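As a minimal numerical sketch of the bound above (assuming natural logarithms, as is standard for this formula; the function name and example values are illustrative):

```python
import math

def vc_generalization_bound(train_error, d, n, eta):
    """Upper bound on test error, holding with probability 1 - eta.
    d is the VC dimension, n the training-set size; only meaningful
    when d << n."""
    if not (0 < eta <= 1):
        raise ValueError("eta must be in (0, 1]")
    if d >= n:
        raise ValueError("the bound requires d << n")
    # Complexity term: sqrt((1/n) * [d*(log(2n/d) + 1) - log(eta/4)])
    complexity = math.sqrt((d * (math.log(2 * n / d) + 1)
                            - math.log(eta / 4)) / n)
    return train_error + complexity

# Example: a model with VC dimension 10 trained on 10,000 samples,
# achieving 5% training error; the bound holds with probability 0.95.
print(vc_generalization_bound(train_error=0.05, d=10, n=10_000, eta=0.05))
# -> roughly 0.145: test error is at most ~14.5% with 95% probability.
```

Note how the complexity term shrinks as $N$ grows and grows with $D$, which is the qualitative content of the bound.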

In computational geometry, the VC dimension is one of the critical parameters determining the size of ε-nets, which in turn determines the complexity of approximation algorithms based on them; range sets without finite VC dimension may not have finite ε-nets at all.
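As an illustration, the key point is that the required ε-net size depends only on the VC dimension $d$ and on ε, not on the size of the underlying ground set. A minimal sketch follows; the constants are taken from one common statement of the Haussler–Welzl theorem and vary across sources, so treat them as assumptions:

```python
import math

def epsilon_net_sample_size(d, eps, delta):
    """Sample size sufficient for a uniform random sample to be an
    eps-net of a range space with VC dimension d, with probability
    >= 1 - delta. Constants are illustrative; the essential point is
    the O((d/eps) * log(1/eps)) dependence, which is finite exactly
    when the VC dimension is finite."""
    return math.ceil(max(
        (4 / eps) * math.log2(2 / delta),
        (8 * d / eps) * math.log2(13 / eps),
    ))

# Example: half-planes in R^2 have VC dimension 3; an eps-net for
# eps = 0.1 with failure probability 0.01 needs roughly this many points.
print(epsilon_net_sample_size(d=3, eps=0.1, delta=0.01))
```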
