Matrix block multiplication: definition, properties and applications


I would like a clear definition of block matrix multiplication, its properties, and some applications. If possible, some book references.

Suppose we have the product $AB$, where $A$ and $B$ are matrices. How can one partition $A$ and $B$ into block matrices so that the blocks can be used to compute the product $C = AB$?

Thanks.


There are 2 best solutions below


Here is what I learned from Chapter 1 of Artin's Algebra:

To multiply two matrices $M_{m \times n}$ and $M'_{n \times p}$, we can decompose them into blocks as follows: $$M=\left(\begin{array}{c|c} A& B \end{array} \right), \quad M'=\left(\begin{array}{c} A' \\ \hline B' \end{array} \right) \implies MM'=AA'+BB',$$ where $A$ has $r$ columns and $A'$ has $r$ rows. Generalising this, we have $$ M= \left( \begin{array}{c|c} A & B \\ \hline C & D \end{array} \right), \quad M'=\left( \begin{array}{c|c} A' & B' \\ \hline C' & D' \end{array} \right) \implies MM'= \left( \begin{array}{c|c} AA'+BC' & AB'+BD' \\ \hline CA'+DC' & CB'+DD' \end{array}\right).$$ This looks just like multiplying $2 \times 2$ matrices. For this to work, each product such as $AA'$ must be defined, which requires the number of columns of $A$ to equal the number of rows of $A'$; and each sum such as $AA'+BC'$ must be defined, which requires the number of columns of $A'$ to equal the number of columns of $C'$. In general, we need the column partition of $M$ to match the row partition of $M'$: the number of columns of $A$ and $C$ must equal the number of rows of $A'$ and $B'$.
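The $2 \times 2$ block formula above can be checked numerically. Here is a minimal sketch using NumPy (the matrix sizes and the split point $r$ are arbitrary choices for illustration):

```python
import numpy as np

# Verify the 2x2 block-multiplication formula on random matrices.
# The only constraint is that the column split of M matches the
# row split of M'.
rng = np.random.default_rng(0)
M  = rng.standard_normal((5, 7))
Mp = rng.standard_normal((7, 4))

r = 3  # split M's columns (and Mp's rows) as 3 + 4
A,  B  = M[:2, :r],  M[:2, r:]   # top row of blocks of M
C,  D  = M[2:, :r],  M[2:, r:]   # bottom row of blocks of M
Ap, Bp = Mp[:r, :2], Mp[:r, 2:]  # top row of blocks of M'
Cp, Dp = Mp[r:, :2], Mp[r:, 2:]  # bottom row of blocks of M'

# Assemble the blockwise product and compare with the ordinary product.
blockwise = np.block([
    [A @ Ap + B @ Cp, A @ Bp + B @ Dp],
    [C @ Ap + D @ Cp, C @ Bp + D @ Dp],
])
assert np.allclose(blockwise, M @ Mp)
```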

A different block multiplication is the following: when multiplying $AB$, we can decompose $B$ into column vectors $B=(B_1|B_2| \cdots|B_n)$, and then $AB=(AB_1|AB_2| \cdots |AB_n)$. This view is useful in showing that the method of row reduction for solving $AX=B$ works.
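This column-by-column view is easy to demonstrate; the sketch below (sizes chosen arbitrarily) computes each column $AB_j$ separately and stacks them back together:

```python
import numpy as np

# AB computed column by column: AB = (AB_1 | AB_2 | ... | AB_n).
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))

# Multiply A against each column of B, then reassemble.
columns = [A @ B[:, j] for j in range(B.shape[1])]
assert np.allclose(np.column_stack(columns), A @ B)
```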

Here is a general theorem for block matrix multiplication:

Consider two matrices $A_{m\times n}$ and $B_{n\times p}$. First partition the columns of $A$ according to $n=n_1+\ldots+n_k$, and then partition the rows of $B$ in the same way, $n=n_1+\ldots +n_k$. This is called a conformal partitioning of $A$ and $B$. Then, for any row partition of $A$, say into $a$ parts, and any column partition of $B$, say into $b$ parts, the product $AB$ can be computed as the product of an $a\times k$ block matrix with a $k\times b$ block matrix.
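As a concrete instance of conformal partitioning, the sketch below takes $a = b = 1$, so the product reduces to $AB=\sum_i A_iB_i$, with $A$'s columns and $B$'s rows split at the same (unequal) cut points. The sizes $n = 2+3+1$ are an arbitrary choice for illustration:

```python
import numpy as np

# Conformal partitioning with n = n1 + n2 + n3 = 2 + 3 + 1:
# A's columns and B's rows are split at the same points, so
# AB = sum_i A_i B_i, where A_i has n_i columns and B_i has n_i rows.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 6))
B = rng.standard_normal((6, 5))

splits = [2, 5]  # cut points giving block sizes 2, 3, 1
A_blocks = np.split(A, splits, axis=1)  # split the columns of A
B_blocks = np.split(B, splits, axis=0)  # split the rows of B

AB = sum(Ai @ Bi for Ai, Bi in zip(A_blocks, B_blocks))
assert np.allclose(AB, A @ B)
```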

For a reference and some examples, you can find it here.


One application is using block decompositions of matrices to solve systems of equations.

If you have the system of equations

$$ Ax=b \tag{1}$$

you can solve it with the block $LU$ decomposition

$$ \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix} = \begin{bmatrix} L_{11} & 0 \\ L_{21} & I_{n-r} \end{bmatrix} \begin{bmatrix} I_{r} & 0 \\ 0 & S \end{bmatrix} \begin{bmatrix} U_{11} & U_{12} \\ 0 & I_{n-r} \end{bmatrix} $$

$$\textrm{ Factor } A_{11} = L_{11}U_{11} \tag{2} $$ $$ \textrm{ Solve } L_{11}U_{12} = A_{12} \textrm{ for } U_{12} \tag{3} $$

$$ \textrm{ Solve } L_{21}U_{11} = A_{21} \textrm{ for } L_{21} \tag{4} $$

$$ \textrm{ Form } S = A_{22} - L_{21}U_{12} \tag{5} $$ then factor $S$ in the same way (apply steps $2$ through $5$ recursively) to obtain $L_{22}$ and $U_{22}$.
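The steps above can be sketched numerically. This is a hedged illustration, not a production routine: it uses a simple Doolittle $LU$ without pivoting, so it assumes the leading block $A_{11}$ (and the Schur complement $S$) can be factored safely; the helper `lu_nopivot` and the test matrix are my own choices, with the block names matching the steps in the text.

```python
import numpy as np

def lu_nopivot(A):
    """Doolittle LU without pivoting: A = L U, L unit lower triangular."""
    n = A.shape[0]
    L, U = np.eye(n), A.astype(float).copy()
    for k in range(n - 1):
        L[k+1:, k] = U[k+1:, k] / U[k, k]
        U[k+1:, k:] -= np.outer(L[k+1:, k], U[k, k:])
    return L, U

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5)) + 5 * np.eye(5)  # keep pivots well away from 0
r = 2  # size of the leading block
A11, A12 = A[:r, :r], A[:r, r:]
A21, A22 = A[r:, :r], A[r:, r:]

L11, U11 = lu_nopivot(A11)             # step 2: factor A11 = L11 U11
U12 = np.linalg.solve(L11, A12)        # step 3: solve L11 U12 = A12
L21 = np.linalg.solve(U11.T, A21.T).T  # step 4: solve L21 U11 = A21
S   = A22 - L21 @ U12                  # step 5: Schur complement
L22, U22 = lu_nopivot(S)               # recurse: factor S = L22 U22

# Reassemble the full factors and check A = L U.
L = np.block([[L11, np.zeros((r, 5 - r))], [L21, L22]])
U = np.block([[U11, U12], [np.zeros((5 - r, r)), U22]])
assert np.allclose(L @ U, A)
```

Once $L$ and $U$ are assembled, $Ax=b$ is solved by the usual forward substitution $Ly=b$ followed by back substitution $Ux=y$.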