Context mixing is a data compression technique in which the next-symbol predictions of two or more statistical models are combined to yield a prediction that is often more accurate than any of the individual predictions. For example, one simple method (not necessarily the best) is to average the probabilities assigned by each model. The random forest illustrates another approach: it outputs the prediction that is the mode (majority vote) of the predictions of its individual models. Combining models is an active area of research in machine learning.
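As a rough illustration, here is a minimal Python sketch of the two simple combination rules mentioned above, averaging and majority vote; the model outputs shown are hypothetical, not taken from any particular compressor.

```python
# Illustrative sketch only: the model outputs below are hypothetical.

def average_mix(probs):
    """Average the next-symbol probabilities assigned by each model."""
    return sum(probs) / len(probs)

def majority_vote(predictions):
    """Return the most common prediction, as a random forest does."""
    return max(set(predictions), key=predictions.count)

# Two toy models predict P(next bit = 1):
print(average_mix([0.7, 0.9]))    # 0.8
print(majority_vote([1, 1, 0]))   # 1
```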
The PAQ series of data compression programs uses context mixing to assign probabilities to individual bits of the input.
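The following is a minimal sketch of logistic mixing, the weighted combination in the log-odds domain used by later PAQ versions to mix bit predictions. The class structure, learning rate, and variable names are simplified assumptions for illustration, not code from PAQ itself.

```python
import math

def stretch(p):
    """Map a probability in (0, 1) to the log-odds (logistic) domain."""
    return math.log(p / (1.0 - p))

def squash(x):
    """Inverse of stretch: map log-odds back to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

class LogisticMixer:
    """Combine per-model bit probabilities with adaptive weights."""

    def __init__(self, n_models, learning_rate=0.01):
        self.weights = [0.0] * n_models
        self.lr = learning_rate
        self.inputs = [0.0] * n_models

    def mix(self, probs):
        """Weighted sum of stretched probabilities, squashed back to (0, 1)."""
        self.inputs = [stretch(p) for p in probs]
        return squash(sum(w * x for w, x in zip(self.weights, self.inputs)))

    def update(self, p_mixed, bit):
        """Shift weight toward the models that predicted the actual bit."""
        error = bit - p_mixed
        self.weights = [w + self.lr * x * error
                        for w, x in zip(self.weights, self.inputs)]

# Usage: mix two model predictions for one bit, then learn from the outcome.
mixer = LogisticMixer(n_models=2)
p = mixer.mix([0.7, 0.9])
mixer.update(p, bit=1)
```

After each bit is coded, the weight update increases the influence of models whose stretched predictions pointed toward the bit that actually occurred, so the mixer adapts to whichever models are locally most accurate.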