Correlation Feature Selection

The Correlation Feature Selection (CFS) measure evaluates subsets of features on the basis of the following hypothesis: "Good feature subsets contain features highly correlated with the classification, yet uncorrelated to each other". The following equation gives the merit of a feature subset S consisting of k features:

Merit_{S_k} = \frac{k \overline{r_{cf}}}{\sqrt{k + k(k-1) \overline{r_{ff}}}}.

Here, \overline{r_{cf}} is the average value of all feature-classification correlations, and \overline{r_{ff}} is the average value of all feature-feature correlations. The CFS criterion is defined as follows:

CFS = \max_{S_k}\left[\frac{r_{cf_1}+r_{cf_2}+\cdots+r_{cf_k}}{\sqrt{k+2(r_{f_1 f_2}+\cdots+r_{f_i f_j}+\cdots+r_{f_k f_1})}}\right].

The r_{cf_i} and r_{f_i f_j} variables are referred to as correlations, but are not necessarily Pearson's correlation coefficient or Spearman's ρ. Mark Hall's dissertation uses neither of these, but rather three different measures of relatedness: minimum description length (MDL), symmetrical uncertainty, and relief.
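
As a concrete illustration, the following is a minimal Python sketch of the merit computation. It assumes absolute Pearson correlation as a stand-in relatedness measure (Hall's dissertation uses MDL, symmetrical uncertainty, or relief instead), and the function name cfs_merit and the array layout are illustrative rather than part of any reference implementation.

    import numpy as np

    def cfs_merit(X, y, subset):
        # Merit of a candidate subset: k * mean(r_cf) / sqrt(k + k*(k-1) * mean(r_ff)).
        # Assumption: |Pearson correlation| stands in for the relatedness measure.
        # X: (n_samples, n_features) array; y: numeric class labels; subset: column indices.
        k = len(subset)
        if k == 0:
            return 0.0
        # average feature-classification correlation (r_cf)
        r_cf = np.mean([abs(np.corrcoef(X[:, f], y)[0, 1]) for f in subset])
        # average feature-feature correlation (r_ff) over distinct pairs
        pairs = [(f, g) for i, f in enumerate(subset) for g in subset[i + 1:]]
        r_ff = np.mean([abs(np.corrcoef(X[:, f], X[:, g])[0, 1]) for f, g in pairs]) if pairs else 0.0
        return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)

For example, cfs_merit(X, y, [0, 3]) scores the candidate subset {f_1, f_4}; a higher merit favors features that are correlated with the class but not with each other.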

Let x_i be the set membership indicator function for feature f_i; then the above can be rewritten as an optimization problem:

CFS = \max_{x\in \{0,1\}^{n}}\left[\frac{\left(\sum^{n}_{i=1}a_{i}x_{i}\right)^{2}}{\sum^{n}_{i=1}x_i + \sum_{i\neq j} 2b_{ij} x_i x_j}\right].

The combinatorial problems above are, in fact, mixed 0-1 linear programming problems that can be solved by using branch-and-bound algorithms.
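
For a small number of features the search space can also be sketched by exhaustive enumeration over the 0-1 indicators. The snippet below reuses the hypothetical cfs_merit helper above; it simply enumerates subsets and is a toy illustration of the search space, not Hall's best-first search nor an actual branch-and-bound solver.

    from itertools import combinations

    def cfs_exhaustive(X, y, max_size=None):
        # Enumerate every subset up to max_size and keep the one with the highest
        # merit. This is exponential in n; practical implementations prune the
        # search with branch-and-bound or best-first search instead.
        n = X.shape[1]
        max_size = max_size if max_size is not None else n
        best_subset, best_merit = [], 0.0
        for k in range(1, max_size + 1):
            for subset in combinations(range(n), k):
                merit = cfs_merit(X, y, list(subset))
                if merit > best_merit:
                    best_subset, best_merit = list(subset), merit
        return best_subset, best_merit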
