Definition
A random vector x = (X1, …, Xk)' is said to have the multivariate normal distribution if it satisfies the following equivalent conditions.
- Every linear combination of its components Y = a1X1 + … + akXk is normally distributed. That is, for any constant vector a ∈ Rk, the random variable Y = a′x has a univariate normal distribution.
- There exists a random ℓ-vector z, whose components are independent standard normal random variables, a k-vector μ, and a k×ℓ matrix A, such that x = Az + μ. Here ℓ is the rank of the covariance matrix Σ = AA′; for the full-rank case in particular, see the section on Geometric interpretation below. A sampling sketch of this construction is given after the list.
- There is a k-vector μ and a symmetric, nonnegative-definite k×k matrix Σ, such that the characteristic function of x is φx(u) = exp(iu′μ − ½u′Σu) for every u ∈ Rk.
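As a concrete illustration of the second condition, the following is a minimal sketch (assuming NumPy; the particular matrix A, vector μ, weights a, and sample size are illustrative choices, not part of the definition). It draws samples of x = Az + μ from independent standard normal components z, checks that the sample mean and covariance approach μ and Σ = AA′, and verifies that a linear combination a′x has the mean and variance predicted by the first condition.

```python
# Sketch of the construction x = A z + mu, where z has independent standard
# normal components.  A, mu, a, and n_samples are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([1.0, -2.0, 0.5])          # k-vector of means (k = 3)
A = np.array([[1.0, 0.0],                # k x l matrix (l = 2), so the
              [0.5, 1.0],                # covariance Sigma = A A' has rank 2
              [0.3, 0.7]])               # and is singular

n_samples = 100_000
z = rng.standard_normal((n_samples, A.shape[1]))  # rows: independent standard normal l-vectors
x = z @ A.T + mu                                  # each row is one draw of x = A z + mu

sigma = A @ A.T                                   # theoretical covariance matrix
print("rank of Sigma:", np.linalg.matrix_rank(sigma))   # 2 < k = 3
print("sample mean:  ", x.mean(axis=0))                 # close to mu
print("sample cov:\n", np.cov(x, rowvar=False))         # close to Sigma

# Any linear combination a'x is univariate normal with mean a'mu and
# variance a' Sigma a (first condition).
a = np.array([2.0, -1.0, 3.0])
y = x @ a
print("mean of a'x:", y.mean(), "vs", a @ mu)
print("var  of a'x:", y.var(), "vs", a @ sigma @ a)
```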
The covariance matrix is allowed to be singular, in which case the corresponding distribution has no density with respect to Lebesgue measure on Rk. This case arises frequently in statistics; for example, in the distribution of the vector of residuals in ordinary least squares regression. Note also that the Xi are in general not independent; they can be seen as the result of applying the matrix A to a collection of independent Gaussian variables z.
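To make the residual example concrete, here is a minimal sketch (assuming NumPy; the design matrix below is a hypothetical choice, not anything from the text) showing that the covariance of the OLS residual vector, which is proportional to I − H with H the hat matrix, has rank n − p and is therefore singular, so no density exists.

```python
# With normal errors, the OLS residual vector e = (I - H)y is multivariate
# normal with covariance sigma^2 (I - H), where H = X(X'X)^{-1}X'.
# Since I - H has rank n - p, that covariance matrix is singular.
import numpy as np

n, p = 6, 2
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])  # n x p design matrix
H = X @ np.linalg.inv(X.T @ X) @ X.T                          # hat (projection) matrix

cov_resid = np.eye(n) - H   # residual covariance, up to the factor sigma^2
print("rank of I - H:", np.linalg.matrix_rank(cov_resid))     # n - p = 4 < n = 6
print("det  of I - H:", np.linalg.det(cov_resid))             # ~ 0, so no density exists
```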