Linear combination of vectors vs Dot product in Matrix vector multiplication


Consider the operation:

$\begin{bmatrix}2 & 0\\1 & 2\end{bmatrix}$ $\begin{bmatrix}1\\1\end{bmatrix}$

Here, the matrix encodes the transformation that sends the basis vectors $\mathbf{i}$ and $\mathbf{j}$ to $\begin{bmatrix}2\\1\end{bmatrix}$ and $\begin{bmatrix}0\\2\end{bmatrix}$ respectively. Since the vector $\begin{bmatrix}1\\1\end{bmatrix}$ is one step along $\mathbf{i}$ plus one step along $\mathbf{j}$, after the transformation it lands at $\begin{bmatrix}2\\3\end{bmatrix}$.
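Written out, this is a linear combination of the matrix's columns, weighted by the entries of the vector:

$$\begin{bmatrix}2 & 0\\1 & 2\end{bmatrix}\begin{bmatrix}1\\1\end{bmatrix} = 1\begin{bmatrix}2\\1\end{bmatrix} + 1\begin{bmatrix}0\\2\end{bmatrix} = \begin{bmatrix}2\\3\end{bmatrix}$$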

When expressed as such a linear combination, matrix-vector multiplication is very intuitive. However, the same result vector is obtained by taking the dot product of each row of the matrix with the column vector. Why is this possible? Clearly, the entries of the row $[2 \; 0]$ do not belong together; they come from different column vectors. Yet doing that works. Is it some numerical coincidence, or am I missing a geometric perspective?

EDIT

Looking at FoobazJohn's answer, there seems to be some relation between them. What is it ?


BEST ANSWER

You've got the right idea: the matrix changes basis, but it's from the basis consisting of its columns to the standard basis...

As to why taking the dot product of each row with the column vector does this, there is no real mystery: it is precisely the definition of matrix multiplication...
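A quick sketch (using NumPy, with the example matrix from the question) showing that the column-combination view and the row-dot-product view are the same computation:

```python
import numpy as np

A = np.array([[2, 0],
              [1, 2]])
x = np.array([1, 1])

# View 1: a linear combination of the columns of A, weighted by the entries of x.
col_view = x[0] * A[:, 0] + x[1] * A[:, 1]

# View 2: the dot product of each row of A with x.
row_view = np.array([A[0, :] @ x, A[1, :] @ x])

# Both agree with the built-in matrix-vector product.
assert np.array_equal(col_view, A @ x)
assert np.array_equal(row_view, A @ x)
print(col_view)  # [2 3]
```

Row $i$ of the result collects, from every column, that column's $i$-th entry times its weight; summing those terms is exactly the dot product of row $i$ with $x$.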

As for connections between columns and rows of a matrix, there certainly are amazing ones... For instance, that the column rank equals the row rank...
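A small sanity check of that rank fact (a sketch with an arbitrary example matrix of my own choosing, not one from the question):

```python
import numpy as np

# A 2x3 example matrix whose second row is twice the first: its rows span a
# 1-dimensional space, and so do its columns, even though the rows live in
# R^3 and the columns in R^2.
B = np.array([[1, 2, 3],
              [2, 4, 6]])

# Row rank (rank of B) equals column rank (rank of B transposed).
assert np.linalg.matrix_rank(B) == np.linalg.matrix_rank(B.T) == 1
```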