Rank (linear Algebra) - Column Rank = Row Rank, or rk(A) = rk(A^T)

This result forms a very important part of the fundamental theorem of linear algebra. We present two proofs of this result. The first is short and uses only basic properties of linear combinations of vectors. The second is an elegant argument using orthogonality and is based upon: Mackiw, G. (1995). A Note on the Equality of the Column and Row Rank of a Matrix. Mathematics Magazine, Vol. 68, No. 4. Interestingly, the first proof begins with a basis for the column space, while the second builds from a basis for the row space. The first proof is valid when the matrices are defined over any field of scalars, while the second proof works only for inner-product spaces. Of course, both work for real and complex Euclidean spaces. Also, the proofs are easily adapted when A is a linear transformation.

First proof: Let A be an m × n matrix whose column rank is r. Therefore, the dimension of the column space of A is r. Let c1, c2, …, cr be any basis for the column space of A and place them as column vectors to form the m × r matrix C = [c1 c2 … cr]. Therefore, each column vector of A is a linear combination of the r columns of C. From the definition of matrix multiplication, there exists an r × n matrix R such that A = CR. (The (i, j)-th element of R is the coefficient of ci when the j-th column of A is expressed as a linear combination of the r columns of C. Also see rank factorization.) Now, since A = CR, every row vector of A is a linear combination of the row vectors of R. (The (i, j)-th element of C is the coefficient of the j-th row vector of R when the i-th row of A is expressed as a linear combination of the r rows of R.) This means that the row space of A is contained within the row space of R. Therefore, we have row rank of A ≤ row rank of R. But note that R has only r rows, so the row rank of R ≤ r = column rank of A. This proves that row rank of A ≤ column rank of A. Now apply the result to the transpose of A to get the reverse inequality: column rank of A = row rank of A^T ≤ column rank of A^T = row rank of A. This proves column rank of A equals row rank of A. See a very similar but more direct proof for rk(A) = rk(A^T) under rank factorization. QED
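The factorization A = CR at the heart of this proof is easy to check numerically. The sketch below (using NumPy; the particular matrix is an illustrative choice, not from the original text) builds C from r linearly independent columns of a rank-2 matrix, recovers R by least squares, and confirms that row rank equals column rank:

```python
import numpy as np

# A 3 x 4 matrix of rank 2: the third row is the sum of the first two.
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 3.],
              [1., 3., 1., 4.]])

r = np.linalg.matrix_rank(A)   # column rank of A

# Columns 0 and 1 of this A happen to be linearly independent, so they
# form a basis for the column space; stack them as the matrix C.
C = A[:, :2]

# Solve C @ R = A for R by least squares; the fit is exact because every
# column of A lies in the column space of C.
R, *_ = np.linalg.lstsq(C, A, rcond=None)

assert np.allclose(A, C @ R)               # the factorization A = CR
assert np.linalg.matrix_rank(A.T) == r     # row rank equals column rank
```

The choice of columns 0 and 1 for C is specific to this example matrix; in general one would select any r independent columns, e.g. via a pivoted QR decomposition.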

Second proof: Let A be an m × n matrix whose row rank is r. Therefore, the dimension of the row space of A is r, and suppose that x1, x2, …, xr is a basis of the row space of A. We claim that the vectors Ax1, Ax2, …, Axr are linearly independent. To see why, consider the homogeneous linear relation involving these vectors with scalar coefficients c1, c2, …, cr:

0 = c1 Ax1 + c2 Ax2 + … + cr Axr = A(c1 x1 + c2 x2 + … + cr xr) = Av,

where v = c1 x1 + c2 x2 + … + cr xr. We make two observations: (a) v is a linear combination of vectors in the row space of A, which implies that v belongs to the row space of A, and (b) since Av = 0, v is orthogonal to every row vector of A and, hence, is orthogonal to every vector in the row space of A. The facts (a) and (b) together imply that v is orthogonal to itself, which proves that v = 0 or, by the definition of v:

c1 x1 + c2 x2 + … + cr xr = 0.

But recall that the xi are linearly independent because they form a basis of the row space of A. This implies that c1 = c2 = … = cr = 0, which proves our claim that Ax1, Ax2, …, Axr are linearly independent. Now, each Axi is obviously a vector in the column space of A. So, {Ax1, Ax2, …, Axr} is a set of r linearly independent vectors in the column space of A and, hence, the dimension of the column space of A (i.e. the column rank of A) must be at least as big as r. This proves that row rank of A = r ≤ column rank of A. Now apply this result to the transpose of A to get the reverse inequality: column rank of A = row rank of A^T ≤ column rank of A^T = row rank of A. This proves column rank of A equals row rank of A or, equivalently, rk(A) = rk(A^T). QED.
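The key step of this proof, that mapping a row-space basis x1, …, xr through A yields r independent vectors in the column space, can be illustrated numerically. In this sketch (using NumPy; the matrix is an illustrative assumption, not from the original text) the first two rows of a rank-2 matrix serve as the row-space basis:

```python
import numpy as np

# A 3 x 3 matrix of rank 2: the third row is the sum of the first two.
A = np.array([[1., 2., 0.],
              [2., 4., 1.],
              [3., 6., 1.]])

# Rows 1 and 2 are linearly independent, so they form a basis
# x1, x2 of the row space; store them as the columns of X.
X = A[:2].T

# Columns of AX are A x1 and A x2; the proof says they are
# linearly independent, so AX has full rank r = 2.
AX = A @ X
assert np.linalg.matrix_rank(AX) == np.linalg.matrix_rank(A) == 2
```

Because AX has r independent columns living inside the column space of A, the column rank is at least r, exactly as the argument above concludes.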

Finally, we provide a proof of the related result, rk(A) = rk(A*), where A* is the conjugate transpose or Hermitian transpose of A. When the elements of A are real numbers, this result becomes rk(A) = rk(A^T) and can constitute another proof that row rank = column rank. Otherwise, for complex matrices, rk(A) = rk(A*) is not equivalent to row rank = column rank, and one of the above two proofs should be used. This proof is short, elegant and makes use of the null space.

Third proof: Let A be an m × n matrix. Define rk(A) to mean the column rank of A. First note that A*Ax = 0 if and only if Ax = 0. This is elementary linear algebra: one direction is trivial; the other follows from

A*Ax = 0 ⇒ x*A*Ax = 0 ⇒ (Ax)*(Ax) = ‖Ax‖² = 0 ⇒ Ax = 0,

where ‖·‖ is the Euclidean norm. This proves that the null space of A is equal to the null space of A*A. From the rank–nullity theorem, we obtain rk(A) = rk(A*A). (Alternate argument: since A*Ax = 0 if and only if Ax = 0, the columns of A*A satisfy the same linear relationships as the columns of A. In particular, they must have the same number of linearly independent columns and, hence, the same column rank.) Each column of A*A is a linear combination of the columns of A*. Therefore, the column space of A*A is a subspace of the column space of A*. This implies that rk(A*A) ≤ rk(A*). We have proved: rk(A) = rk(A*A) ≤ rk(A*). Now apply this result to A* to obtain the reverse inequality: since (A*)* = A, we can write rk(A*) ≤ rk((A*)*) = rk(A). This proves rk(A) = rk(A*). When the elements of A are real, the conjugate transpose is the transpose and we obtain rk(A) = rk(A^T). QED.
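The chain rk(A) = rk(A*A) = rk(A*) can be checked on a complex matrix. This sketch (using NumPy; the random rank-2 construction is an illustrative assumption) builds A as a product of a 4 × 2 and a 2 × 3 factor, which forces its rank to be at most 2:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random complex 4 x 3 matrix of rank 2 (a product of a 4 x 2
# and a 2 x 3 factor has rank at most 2, and generically exactly 2).
B = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
C = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))
A = B @ C

Ah = A.conj().T   # the conjugate transpose A*

# Null-space equality gives rk(A) = rk(A*A), and the column space of
# A*A sits inside that of A*, closing the chain rk(A) = rk(A*).
assert np.linalg.matrix_rank(Ah @ A) == np.linalg.matrix_rank(A) == 2
assert np.linalg.matrix_rank(Ah) == np.linalg.matrix_rank(A)
```

Note that for this complex A, rk(A) = rk(A*) holds even though the entries of A* are conjugated, which is exactly why the result is stated with the Hermitian transpose rather than the plain transpose.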
