Algebraic/Geometric interpretation for product of rectangular matrices.


I am a novice in the area of representing vectors by matrices. Please vet my understanding below and help.

I know only that matrices can represent vectors in $n$-dimensional space.
Say, for $1$-D a matrix represents a point, as $[a_1]$. For a collection of $n$ points, the matrix representation is $[a_1\,\,\ a_2\,\,\cdots a_n]$.
For the $2$-dimensional plane, a row represents a vector with coordinates $[a_{11}\,\,\, a_{12}]$. For a collection of $n$ vectors (algebraically, as points), the matrix representation is $\begin{bmatrix} a_{11} & a_{12}\\ a_{21} & a_{22}\\ \vdots & \vdots\\ a_{n1} & a_{n2}\\ \end{bmatrix} $

My interpretation of the above is that there are $n$ vectors (points), the $i$-th of which has $x$-coordinate $a_{i1}$ and $y$-coordinate $a_{i2}$ (row $i$ corresponds to the $i$-th point).

Similarly for any general matrix of size $m \times n$: geometrically it is viewed as a list of vectors, and algebraically as a system of equations.

If the number of rows ($r$) is less than the number of columns ($c$), then algebraically the number of equations ($e$) is less than the number of unknowns (variables) ($v$). In the opposite situation, where the number of rows ($r(=e)$) is greater than the number of columns ($c(=v)$), we have $r(=e)\gt c(=v)$: more equations than unknowns, leading to a unique solution when the number of independent equations equals $c(=v)$.

The geometrical significance depends on the actual number of independent equations and the dimension of the space: if the space is $n$-dimensional, then there cannot be more than $n$ independent equations.

But the product of two rectangular matrices is quite possible, and may yield a square or a rectangular matrix. Below, a square matrix is formed as the product of two rectangular matrices.

$A = \begin{bmatrix} a_{11} & a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23} \\ \end{bmatrix} $, $B = \begin{bmatrix} b_{11} & b_{12}\\ b_{21} & b_{22}\\ b_{31} & b_{32} \end{bmatrix} $, $AB = \begin{bmatrix} a_{11}b_{11} + a_{12}b_{21} + a_{13}b_{31} & a_{11}b_{12} + a_{12}b_{22} + a_{13}b_{32}\\ a_{21}b_{11} + a_{22}b_{21} + a_{23}b_{31} & a_{21}b_{12} + a_{22}b_{22} + a_{23}b_{32}\\ \end{bmatrix}$
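(Not part of the original question, but the entry-wise product formula above, $(AB)_{ij} = \sum_k a_{ik}b_{kj}$, can be checked numerically. A minimal NumPy sketch with made-up random entries:)

```python
import numpy as np

# Check the entry-wise formula (AB)_{ij} = sum_k A[i,k] * B[k,j]
# for a 2x3 matrix A and a 3x2 matrix B, as in the question.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 2))

AB = A @ B                        # the matrix product, a 2x2 matrix

manual = np.empty((2, 2))
for i in range(2):
    for j in range(2):
        manual[i, j] = sum(A[i, k] * B[k, j] for k in range(3))

assert AB.shape == (2, 2)
assert np.allclose(AB, manual)
```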

So suppose we have two matrices as above: $A$ is $2 \times 3$ and $B$ is $3 \times 2$. Then for $A$ there is one equation fewer in the system (of vectors) than the number of unknowns (the dimension of the space) to solve. The opposite is the case for matrix $B$: at least one of the three equations (vectors) must be a linear combination of (dependent on) the other two.
But how can the product be interpreted, either algebraically or geometrically?
Particularly if the product yields a rectangular matrix. Say the two matrices are $A', B'$ of sizes $A' = 2 \times 3$, $B' = 3 \times 1$, so $C' = A'B' = 2 \times 1$.

I would also appreciate any reference source for the same.

Best Answer

An $m \times n$ matrix represents a linear map from a vector space of dimension $n$ to a vector space of dimension $m$. If we represent a vector in an $n$-dimensional vector space by an $n \times 1$ column matrix (relative to a given basis), then we apply the linear map by multiplying the column vector by the matrix.

For example, we can define a linear map $A: \mathbb{R}^3 \rightarrow \mathbb{R}^2$ by

$A(B) = A'B'$

where $A'$ is a $2 \times 3$ matrix and the $3 \times 1$ column matrix $B'$ represents a vector $B$ in $\mathbb{R}^3$. $A'B'$ is, as we expect, a $2 \times 1$ column matrix which represents a vector in $\mathbb{R}^2$.

Informally, each column of $A'$ represents the image in $\mathbb{R}^2$ of one of the basis vectors in $\mathbb{R}^3$.
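(A numerical sketch of the two points above, with a made-up $2 \times 3$ matrix; the variable names `A_` and `B_` are illustrative, not from the answer:)

```python
import numpy as np

# A made-up 2x3 matrix, representing a linear map R^3 -> R^2.
A_ = np.array([[1.0, 2.0, 3.0],
               [4.0, 5.0, 6.0]])

# The j-th column of A_ is the image of the j-th standard basis vector.
for j in range(3):
    e_j = np.zeros(3)
    e_j[j] = 1.0                  # j-th standard basis vector of R^3
    assert np.allclose(A_ @ e_j, A_[:, j])

# Applying A_ to a 3x1 column vector yields a 2x1 column vector in R^2.
B_ = np.array([[1.0], [0.0], [-1.0]])
print(A_ @ B_)                    # a (2, 1) column: [[-2.], [-2.]]
```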

Multiplication of matrices is the same as composition of linear maps. For example, if we have a linear map $X: \mathbb{R}^2 \rightarrow \mathbb{R}^4$ which we represent by a $4 \times 2$ matrix $X'$, then we can compose this with the map $A: \mathbb{R}^3 \rightarrow \mathbb{R}^2$ to create a new linear map $Y: \mathbb{R}^3 \rightarrow \mathbb{R}^4$ defined by:

$Y(B) = X(A(B)) = (X \circ A)(B)$

and $Y$ is represented by the $4 \times 3$ matrix $Y'$ where

$Y' = X'A'$
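(The composition above can also be sketched numerically; a minimal NumPy check with made-up random matrices, where `X_`, `A_`, `Y_` stand in for $X'$, $A'$, $Y'$:)

```python
import numpy as np

# X_: R^2 -> R^4 (a 4x2 matrix), A_: R^3 -> R^2 (a 2x3 matrix).
rng = np.random.default_rng(1)
X_ = rng.standard_normal((4, 2))
A_ = rng.standard_normal((2, 3))

# Y_ = X_ A_ represents the composition X ∘ A : R^3 -> R^4,
# so it is a 4x3 matrix.
Y_ = X_ @ A_
assert Y_.shape == (4, 3)

# Applying Y_ to a vector B gives the same result as applying
# A_ first and then X_ : Y(B) = X(A(B)).
B = rng.standard_normal((3, 1))
assert np.allclose(Y_ @ B, X_ @ (A_ @ B))
```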