History
The heuristic approach of self-training (also known as self-learning or self-labeling) is historically the oldest approach to semi-supervised learning, with examples of applications dating to the 1960s (see, for instance, Scudder (1965)).
The transductive learning framework was formally introduced by Vladimir Vapnik in the 1970s. Interest in inductive learning using generative models also began in the 1970s. A probably approximately correct (PAC) learning bound for semi-supervised learning of a Gaussian mixture was demonstrated by Ratsaby and Venkatesh in 1995.
Semi-supervised learning has recently become more popular and practically relevant because vast quantities of unlabeled data are available for many problems, such as text on websites, protein sequences, and images. For a review of recent work, see the survey article by Zhu (2008).