Correlation Feature Selection

The Correlation Feature Selection (CFS) measure evaluates subsets of features on the basis of the following hypothesis: "Good feature subsets contain features highly correlated with the classification, yet uncorrelated to each other". The following equation gives the merit of a feature subset S consisting of k features:

\mathrm{Merit}_{S_k} = \frac{k \overline{r_{cf}}}{\sqrt{k + k(k-1) \overline{r_{ff}}}}.

Here, \overline{r_{cf}} is the average value of all feature-classification correlations, and \overline{r_{ff}} is the average value of all feature-feature correlations. The CFS criterion is defined as follows:

CFS = \max_{S_k} \left[\frac{r_{cf_1} + r_{cf_2} + \cdots + r_{cf_k}}{\sqrt{k + 2(r_{f_1 f_2} + \cdots + r_{f_i f_j} + \cdots + r_{f_k f_1})}}\right].
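
To make the merit formula concrete, here is a minimal sketch in Python with NumPy, assuming the absolute Pearson correlation as the correlation measure (purely for illustration; as noted below, Hall's own work uses other measures of relatedness). The function name cfs_merit and all other names are illustrative.

import numpy as np

def cfs_merit(X, y, subset):
    # Merit of the feature subset (a list of column indices of X):
    # k * mean(r_cf) / sqrt(k + k*(k-1) * mean(r_ff)).
    k = len(subset)
    # Average feature-classification correlation r_cf
    # (absolute Pearson correlation, an illustrative assumption).
    r_cf = np.mean([abs(np.corrcoef(X[:, i], y)[0, 1]) for i in subset])
    if k == 1:
        return r_cf
    # Average feature-feature correlation r_ff over all pairs in the subset.
    pairs = [(i, j) for a, i in enumerate(subset) for j in subset[a + 1:]]
    r_ff = np.mean([abs(np.corrcoef(X[:, i], X[:, j])[0, 1]) for i, j in pairs])
    return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)

A subset of individually predictive but mutually redundant features scores lower than the same number of predictive, mutually uncorrelated features, which is exactly the stated hypothesis.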

The r_{cf_i} and r_{f_i f_j} variables are referred to as correlations, but are not necessarily Pearson's correlation coefficient or Spearman's ρ. Mark Hall's dissertation uses neither of these, but three different measures of relatedness: minimum description length (MDL), symmetrical uncertainty, and relief.
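
For instance, the symmetrical uncertainty between two discrete variables is SU(X, Y) = 2 I(X;Y) / (H(X) + H(Y)), a normalized mutual information lying in [0, 1]. A minimal, self-contained sketch (all names illustrative, discrete-valued inputs assumed):

from collections import Counter
import math

def entropy(values):
    # Shannon entropy (in bits) of the empirical distribution of values.
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in Counter(values).values())

def symmetrical_uncertainty(xs, ys):
    # SU = 2 * I(X;Y) / (H(X) + H(Y)), with I(X;Y) = H(X) + H(Y) - H(X,Y).
    hx, hy = entropy(xs), entropy(ys)
    hxy = entropy(list(zip(xs, ys)))
    return 2 * (hx + hy - hxy) / (hx + hy) if hx + hy > 0 else 0.0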

Let x_i be the set membership indicator function for feature f_i, and write a_i = r_{cf_i} and b_{ij} = r_{f_i f_j}; then the above can be rewritten as an optimization problem:

CFS = \max_{x\in \{0,1\}^{n}}
\left[\frac{(\sum^{n}_{i=1}a_{i}x_{i})^{2}}
{\sum^{n}_{i=1}x_i + \sum_{i\neq j} 2b_{ij} x_i x_j }\right].

The combinatorial problems above are, in fact, mixed 0-1 linear programming problems that can be solved by using branch-and-bound algorithms.
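
Exhaustively maximizing the objective over all 2^n indicator vectors is intractable for large n, so in practice the subset lattice is usually searched heuristically (Hall paired CFS with best-first search). The following is a hedged sketch of a plain greedy forward search, not the branch-and-bound method mentioned above, reusing the illustrative cfs_merit from the earlier sketch:

def greedy_cfs(X, y):
    # Greedy forward selection: repeatedly add the feature that most
    # improves the CFS merit; stop when no addition improves it.
    remaining = list(range(X.shape[1]))
    selected, best = [], 0.0
    while remaining:
        merit, f = max((cfs_merit(X, y, selected + [f]), f) for f in remaining)
        if merit <= best:
            break
        selected.append(f)
        remaining.remove(f)
        best = merit
    return selected, best

For example, on synthetic data where the class depends only on the first two columns, those two columns should be selected:

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
print(greedy_cfs(X, y))  # expected: features 0 and 1 and their merit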
