Multivariate Analysis of Variance

Multivariate analysis of variance (MANOVA) is a statistical test procedure for comparing multivariate (population) means of several groups. Unlike ANOVA, it uses the covariance between outcome variables in testing the statistical significance of the mean differences.

It is a generalized form of univariate analysis of variance (ANOVA) and is used when there are two or more dependent variables. It helps to answer three questions: (1) do changes in the independent variable(s) have significant effects on the dependent variables; (2) what are the interactions among the dependent variables; and (3) what are the interactions among the independent variables? Essentially, MANOVA combines the scores on the multiple dependent variables into a single composite variable, which makes it possible to test for the effects above. Statistical reports, however, will provide individual p-values for each dependent variable, indicating whether the differences and interactions are statistically significant.
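
As a concrete illustration, here is a minimal sketch of a one-way MANOVA in Python using the statsmodels package. The data frame, column names, and group labels are invented for the example; MANOVA.from_formula and mv_test are part of the statsmodels API.

    # Minimal sketch: one-way MANOVA with two dependent variables.
    # The data set is simulated purely for illustration.
    import numpy as np
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    rng = np.random.default_rng(0)
    n = 30
    df = pd.DataFrame({
        "group": np.repeat(["a", "b", "c"], n),
        # two dependent variables whose means shift across groups
        "y1": rng.normal(0, 1, 3 * n) + np.repeat([0.0, 0.5, 1.0], n),
        "y2": rng.normal(0, 1, 3 * n) + np.repeat([0.0, 0.3, 0.6], n),
    })

    # Both dependent variables enter the left-hand side of the formula.
    fit = MANOVA.from_formula("y1 + y2 ~ group", data=df)
    print(fit.mv_test())  # reports Wilks' lambda, Pillai's trace, etc.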

Where sums of squares appear in univariate analysis of variance, in multivariate analysis of variance certain positive-definite matrices appear. The diagonal entries are the same kinds of sums of squares that appear in univariate ANOVA. The off-diagonal entries are corresponding sums of products. Under normality assumptions about error distributions, the counterpart of the sum of squares due to error has a Wishart distribution.
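
To make these matrices concrete, below is a small NumPy sketch of the hypothesis and error sums-of-squares-and-products (SSP) matrices for a one-way layout. The helper name ssp_matrices and the input convention (one (n_i x p) array per group) are assumptions of the example.

    # Sketch: hypothesis (H) and error (E) SSP matrices, one-way design.
    # `groups` is a list of (n_i x p) NumPy arrays, one array per group.
    import numpy as np

    def ssp_matrices(groups):
        all_obs = np.vstack(groups)            # (N x p) pooled observations
        grand_mean = all_obs.mean(axis=0)
        p = all_obs.shape[1]
        H = np.zeros((p, p))                   # between-groups SSP
        E = np.zeros((p, p))                   # within-groups (error) SSP
        for g in groups:
            m = g.mean(axis=0)
            d = (m - grand_mean)[:, None]
            H += len(g) * (d @ d.T)            # n_i (m_i - m)(m_i - m)'
            centered = g - m
            E += centered.T @ centered         # sum of (x - m_i)(x - m_i)'
        return H, E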

Analogous to ANOVA, MANOVA is based on the product of the model variance matrix, Σ_model, and the inverse of the error variance matrix, Σ_res⁻¹, that is, A = Σ_model Σ_res⁻¹. The hypothesis that Σ_model = Σ_res implies that the product A ∼ I. Invariance considerations imply the MANOVA statistic should be a measure of the magnitude of the singular value decomposition of this matrix product, but there is no unique choice owing to the multi-dimensional nature of the alternative hypothesis.

The most common statistics are summaries based on the roots (or eigenvalues) λ_p of the matrix A (a computational sketch follows the list):

  • Samuel Stanley Wilks' Λ_Wilks = ∏ 1/(1 + λ_p) = det(I + A)⁻¹, distributed as lambda (Λ)
  • the Pillai-M. S. Bartlett trace, Λ_Pillai = ∑ λ_p/(1 + λ_p) = tr(A(I + A)⁻¹)
  • the Lawley-Hotelling trace, Λ_LH = ∑ λ_p = tr(A)
  • Roy's greatest root (also called Roy's largest root), Λ_Roy = max(λ_p)
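
Conventional implementations work with the SSP matrices directly, taking the eigenvalues of H E⁻¹ (which are proportional to those of Σ_model Σ_res⁻¹). A sketch continuing from the hypothetical ssp_matrices helper above:

    # Sketch: the four classical MANOVA statistics from the eigenvalues of A.
    import numpy as np

    def manova_statistics(H, E):
        A = H @ np.linalg.inv(E)
        # A itself is not symmetric, but its eigenvalues are real and
        # non-negative because H and E are positive (semi-)definite.
        lam = np.real(np.linalg.eigvals(A))
        return {
            "wilks_lambda": np.prod(1.0 / (1.0 + lam)),
            "pillai_trace": np.sum(lam / (1.0 + lam)),
            "lawley_hotelling": np.sum(lam),
            "roys_greatest_root": np.max(lam),
        }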

Discussion continues over the merits of each, though the greatest root leads only to a bound on significance, which is not generally of practical interest. A further complication is that the distribution of these statistics under the null hypothesis is not straightforward and can only be approximated, except in a few low-dimensional cases. The best-known approximation for Wilks' lambda was derived by C. R. Rao.
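
As a reference point, here is a sketch of Rao's F approximation in its usual textbook form; the parameterization (p dependent variables, hypothesis degrees of freedom vh, error degrees of freedom ve) is the standard one and is not taken from the text above.

    # Sketch: Rao's F approximation for Wilks' lambda (standard textbook form).
    # p: number of dependent variables, vh: hypothesis df, ve: error df.
    import math

    def rao_f_approximation(wilks_lambda, p, vh, ve):
        denom = p * p + vh * vh - 5
        t = math.sqrt((p * p * vh * vh - 4) / denom) if denom > 0 else 1.0
        w = ve + vh - (p + vh + 1) / 2
        df1 = p * vh
        df2 = w * t - (p * vh - 2) / 2
        lam_t = wilks_lambda ** (1.0 / t)
        f = (1 - lam_t) / lam_t * (df2 / df1)
        return f, df1, df2  # F is approximately F(df1, df2) under the null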

In the case of two groups, all the statistics are equivalent and the test reduces to Hotelling's T-square.
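
That two-group statistic can also be computed directly. Below is a minimal NumPy sketch, where x and y are assumed to be (n1 x p) and (n2 x p) sample matrices.

    # Sketch: two-sample Hotelling's T-square statistic.
    import numpy as np

    def hotelling_t2(x, y):
        n1, n2 = len(x), len(y)
        diff = x.mean(axis=0) - y.mean(axis=0)
        # pooled within-group covariance matrix
        s_pooled = ((n1 - 1) * np.cov(x, rowvar=False) +
                    (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
        # ((n1+n2-p-1) / (p*(n1+n2-2))) * T2 is F(p, n1+n2-p-1) under the null
        return (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(s_pooled, diff)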
