Matrix Multiplication - The Inner and Outer Products

The Inner and Outer Products

Given two column vectors a and b, the Euclidean inner product and the outer product are the simplest special cases of the matrix product: each is obtained by transposing one of the column vectors into a row vector.

The inner product

\mathbf{a}\cdot \mathbf{b} = \mathbf{a}^\mathrm{T}\mathbf{b}

is a column vector multiplied on the left by a row vector.

More explicitly,

\mathbf{a}\cdot \mathbf{b} =
\begin{pmatrix}a_1 & a_2 & \cdots & a_n\end{pmatrix}
\begin{pmatrix}b_1 \\ b_2 \\ \vdots \\ b_n\end{pmatrix}
= a_1b_1+a_2b_2+\cdots+a_nb_n = \sum_{i=1}^n a_ib_i
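
As a quick numerical check, here is a minimal NumPy sketch (the vector entries are made up for illustration) computing the same inner product both as a row vector times a column vector and with NumPy's built-in dot product:

import numpy as np

a = np.array([1.0, 2.0, 3.0])   # the entries a_1, ..., a_n
b = np.array([4.0, 5.0, 6.0])   # the entries b_1, ..., b_n

# a^T b: a reshaped into a 1 x n row vector times b reshaped into an n x 1 column vector.
row_times_column = a.reshape(1, -1) @ b.reshape(-1, 1)   # 1 x 1 matrix [[32.]]

# The same scalar via the built-in dot product.
assert row_times_column[0, 0] == np.dot(a, b) == 1*4 + 2*5 + 3*6
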
The outer product

\mathbf{a}\otimes \mathbf{b} = \mathbf{a}\mathbf{b}^\mathrm{T}

is a row vector multiplied on the left by a column vector. More explicitly,

\mathbf{a}\mathbf{b}^\mathrm{T} = \begin{pmatrix}a_1 \\ a_2 \\ \vdots \\ a_n\end{pmatrix}
\begin{pmatrix}b_1 & b_2 & \cdots & b_n\end{pmatrix}
= \begin{pmatrix}
a_1 b_1 & a_1 b_2 & \cdots & a_1 b_n \\
a_2 b_1 & a_2 b_2 & \cdots & a_2 b_n \\
\vdots & \vdots & \ddots & \vdots \\
a_n b_1 & a_n b_2 & \cdots & a_n b_n \\
\end{pmatrix}.
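
A matching sketch for the outer product, again with illustrative values only, compares the explicit column-vector-times-row-vector product with NumPy's np.outer:

import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# a b^T: an n x 1 column vector times a 1 x n row vector gives the n x n matrix
# whose (i, j) entry is a_i * b_j.
column_times_row = a.reshape(-1, 1) @ b.reshape(1, -1)

assert np.array_equal(column_times_row, np.outer(a, b))
# column_times_row:
# [[ 4.  5.  6.]
#  [ 8. 10. 12.]
#  [12. 15. 18.]]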


Matrix product (in terms of inner product)

Suppose that the first n×m matrix A is decomposed into its row vectors ai, and the second m×p matrix B into its column vectors bi:

\mathbf{A} =
\begin{pmatrix}
A_{1 1} & A_{1 2} & \cdots & A_{1 m} \\
A_{2 1} & A_{2 2} & \cdots & A_{2 m} \\
\vdots & \vdots & \ddots & \vdots \\
A_{n 1} & A_{n 2} & \cdots & A_{n m}
\end{pmatrix} = \begin{pmatrix}
\mathbf{a}_1 \\ \mathbf{a}_2 \\ \vdots \\ \mathbf{a}_n
\end{pmatrix},\quad \mathbf{B} = \begin{pmatrix}
B_{1 1} & B_{1 2} & \cdots & B_{1 p} \\
B_{2 1} & B_{2 2} & \cdots & B_{2 p} \\
\vdots & \vdots & \ddots & \vdots \\
B_{m 1} & B_{m 2} & \cdots & B_{m p}
\end{pmatrix}
=
\begin{pmatrix}
\mathbf{b}_1 & \mathbf{b}_2 & \cdots & \mathbf{b}_p
\end{pmatrix}

where

\mathbf{a}_i = \begin{pmatrix} A_{i 1} & A_{i 2} & \cdots & A_{i m} \end{pmatrix},\quad
\mathbf{b}_j = \begin{pmatrix} B_{1 j} \\ B_{2 j} \\ \vdots \\ B_{m j} \end{pmatrix}

are the i-th row of A and the j-th column of B, respectively. The entries of the matrix product given in the introduction are then the inner products of these rows and columns:


\mathbf{AB} =
\begin{pmatrix}
\mathbf{a}_1 \\
\mathbf{a}_2 \\
\vdots \\
\mathbf{a}_n
\end{pmatrix} \begin{pmatrix} \mathbf{b}_1 & \mathbf{b}_2 & \dots & \mathbf{b}_p
\end{pmatrix} = \begin{pmatrix}
(\mathbf{a}_1 \cdot \mathbf{b}_1) & (\mathbf{a}_1 \cdot \mathbf{b}_2) & \dots & (\mathbf{a}_1 \cdot \mathbf{b}_p) \\
(\mathbf{a}_2 \cdot \mathbf{b}_1) & (\mathbf{a}_2 \cdot \mathbf{b}_2) & \dots & (\mathbf{a}_2 \cdot \mathbf{b}_p) \\
\vdots & \vdots & \ddots & \vdots \\
(\mathbf{a}_n \cdot \mathbf{b}_1) & (\mathbf{a}_n \cdot \mathbf{b}_2) & \dots & (\mathbf{a}_n \cdot \mathbf{b}_p)
\end{pmatrix}
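
This row-by-column view can be checked with a small NumPy sketch; the matrices below are made up for illustration, and each entry of the product is rebuilt as the inner product of one row of A with one column of B:

import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])        # n x m with n = 2, m = 3
B = np.array([[ 7.,  8.],
              [ 9., 10.],
              [11., 12.]])          # m x p with p = 2

n, p = A.shape[0], B.shape[1]
AB = np.empty((n, p))
for i in range(n):
    for j in range(p):
        # (AB)_{ij} is the inner product of the i-th row of A and the j-th column of B.
        AB[i, j] = np.dot(A[i, :], B[:, j])

assert np.allclose(AB, A @ B)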


It is also possible to express a matrix product in terms of concatenations of products of matrices and row or column vectors:


\mathbf{AB} =
\begin{pmatrix}
\mathbf{a}_1 \\
\mathbf{a}_2 \\
\vdots \\
\mathbf{a}_n
\end{pmatrix} \begin{pmatrix} \mathbf{b}_1 & \mathbf{b}_2 & \dots & \mathbf{b}_p
\end{pmatrix} = \begin{pmatrix}
\mathbf{A}\mathbf{b}_1 & \mathbf{A}\mathbf{b}_2 & \dots & \mathbf{A}\mathbf{b}_p
\end{pmatrix} = \begin{pmatrix}
\mathbf{a}_1\mathbf{B} \\
\mathbf{a}_2\mathbf{B}\\
\vdots\\
\mathbf{a}_n\mathbf{B}
\end{pmatrix}
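
Both concatenation forms are easy to verify numerically; a hedged sketch with the same kind of made-up matrices builds the product once column by column (A times each column of B) and once row by row (each row of A times B):

import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
B = np.array([[ 7.,  8.],
              [ 9., 10.],
              [11., 12.]])

# Column view: the j-th column of AB is A b_j.
by_columns = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

# Row view: the i-th row of AB is a_i B.
by_rows = np.vstack([A[i, :] @ B for i in range(A.shape[0])])

assert np.allclose(by_columns, A @ B)
assert np.allclose(by_rows, A @ B)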

These decompositions are particularly useful for matrices that are envisioned as concatenations of particular types of row vectors or column vectors, e.g. orthogonal matrices (whose rows and columns are unit vectors orthogonal to each other) and Markov matrices (whose rows or columns sum to 1).

Matrix product (in terms of outer product)

An alternative method results when the decomposition is done the other way around, i.e. the first matrix A is decomposed into its column vectors \mathbf{\bar a}_i and the second matrix B into its row vectors \mathbf{\bar b}_i:


\mathbf{AB} =
\begin{pmatrix} \mathbf{\bar a}_1 & \mathbf{\bar a}_2 & \cdots & \mathbf{\bar a}_m \end{pmatrix}
\begin{pmatrix} \mathbf{\bar b}_1 \\ \mathbf{\bar b}_2 \\ \vdots \\ \mathbf{\bar b}_m \end{pmatrix}
= \mathbf{\bar a}_1 \otimes \mathbf{\bar b}_1 + \mathbf{\bar a}_2 \otimes \mathbf{\bar b}_2 + \cdots + \mathbf{\bar a}_m \otimes \mathbf{\bar b}_m = \sum_{i=1}^m \mathbf{\bar a}_i \otimes \mathbf{\bar b}_i

where this time \mathbf{\bar a}_i is the i-th column of A and \mathbf{\bar b}_i is the i-th row of B; each term \mathbf{\bar a}_i \otimes \mathbf{\bar b}_i is an n×p matrix of rank at most one.
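
This sum-of-outer-products form can also be verified directly; in the minimal NumPy sketch below (illustrative matrices again), the product is accumulated one rank-one term at a time, one per column of A paired with the corresponding row of B:

import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])        # n x m
B = np.array([[ 7.,  8.],
              [ 9., 10.],
              [11., 12.]])          # m x p

# AB as the sum of m outer products: (i-th column of A) outer (i-th row of B).
m = A.shape[1]
AB = sum(np.outer(A[:, i], B[i, :]) for i in range(m))

assert np.allclose(AB, A @ B)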

This method emphasizes the effect of individual column/row pairs on the result, which is a useful point of view with e.g. covariance matrices, where each such pair corresponds to the effect of a single sample point.
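
As a concrete instance of that remark, the sample covariance matrix is such a sum, one outer product per centred sample point. The sketch below assumes a small made-up data set and compares the accumulated outer products against NumPy's np.cov:

import numpy as np

# Made-up data: 5 samples of a 3-dimensional variable, one sample per row.
X = np.array([[1.0, 2.0, 0.5],
              [0.8, 1.9, 0.4],
              [1.2, 2.2, 0.7],
              [0.9, 2.1, 0.6],
              [1.1, 1.8, 0.3]])

Xc = X - X.mean(axis=0)             # centre each variable

# Each centred sample contributes one outer product to the covariance matrix.
n_samples = X.shape[0]
cov = sum(np.outer(x, x) for x in Xc) / (n_samples - 1)

assert np.allclose(cov, np.cov(X, rowvar=False))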

