Is my understanding of matrices correct?


Let $V = \mathbb{R}^2$ be our vector space with the standard basis vectors $J = (1, 0)$, $K = (0, 1)$. We have the linear map

$$T: V \to V$$

We can write any $v \in V$ as a linear combination of the basis vectors (by their very definition) and use the linearity of our map:

$$v = xJ + yK = \begin{bmatrix}x \\ y\end{bmatrix}$$ $$T(v) = T(xJ + yK) = xT(J) +yT(K)$$

What does this mean? It means that every linear transformation is uniquely determined by the images of the basis vectors, $T_J := T(J)$ and $T_K := T(K)$. So we can notate our map in matrix form (some kind of vector consisting of column vectors),

$$T = \begin{bmatrix}T_J\\T_K\end{bmatrix}$$

And we can define matrix-vector multiplication as the dot product of them:

$$\begin{bmatrix}T_J\\T_K\end{bmatrix} \cdot \begin{bmatrix}x\\y\end{bmatrix} = xT_J + yT_K = T(v).$$
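This column-combination rule is easy to check numerically. A minimal sketch with NumPy, where the particular map $T$ is an arbitrary example chosen for illustration:

```python
import numpy as np

# Hypothetical images of the basis vectors under some linear map T:
T_J = np.array([2.0, 1.0])   # T applied to J = (1, 0)
T_K = np.array([0.0, 3.0])   # T applied to K = (0, 1)

x, y = 4.0, 5.0
v = np.array([x, y])

# Matrix whose columns are T_J and T_K.
T = np.column_stack([T_J, T_K])

# T v equals the linear combination x*T(J) + y*T(K).
assert np.allclose(T @ v, x * T_J + y * T_K)
```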

But if we write $T_J, T_K$ out directly as vectors rather than hiding them behind letters, it is more convenient to write our matrix in row form:

$$T = \begin{bmatrix}T_J \space T_K\end{bmatrix} = \begin{bmatrix} \begin{bmatrix}X_0 \\ X_1\end{bmatrix} \begin{bmatrix}Y_0 \\ Y_1\end{bmatrix}\end{bmatrix} \Rightarrow \begin{bmatrix}X_0 & Y_0 \\ X_1 & Y_1\end{bmatrix}$$

Also, this form is more useful for representing our linear transformation / matrix as a system of linear equations.

Is my understanding correct?

Best answer:

Note that in the context of linear algebra, the "dot product" is just an ordinary matrix multiplication. If you are taking the dot product of vectors $u$ and $v$, you have to transpose one of them to make this work, so you get the matrix product $u^Tv.$ That is, if

$$ u = \begin{bmatrix}a \\ b\end{bmatrix} \quad\text{and}\quad v = \begin{bmatrix}x \\ y\end{bmatrix}$$

then the "dot product" of $u$ and $v$ is

$$ \begin{bmatrix}a & b\end{bmatrix} \begin{bmatrix}x \\ y\end{bmatrix} = xa + yb. $$
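In NumPy terms (with purely illustrative values), the $u^Tv$ matrix product and the elementwise dot product agree:

```python
import numpy as np

a, b = 2.0, 3.0
x, y = 4.0, 5.0
u = np.array([[a], [b]])  # column vector
v = np.array([[x], [y]])  # column vector

# The dot product realized as the 1x1 matrix product u^T v.
assert (u.T @ v).item() == x * a + y * b == np.dot([a, b], [x, y])
```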

So if you want to get the result $xT_J + yT_K,$ with $T_J$ in place of $a$ and $T_K$ in place of $b,$ then a naïve interpretation of the linear-algebra "dot product" would lead you to write

$$ \begin{bmatrix}T_J & T_K\end{bmatrix} \begin{bmatrix}x \\ y\end{bmatrix} = xT_J + yT_K. $$

Of course, this is all just a terrible abuse of notation until you decide that it means using $T_J$ and $T_K$ to populate the columns of a matrix, and you are able to prove that this always gives the desired result. Hence we arrive at the final matrix in your question, for which the other answer gives a notation that is a little more explicit about how to construct it.

What is mainly different from the vector "dot product" here is that we don't tend to interpret the matrix on the left as the transpose of something else. It is simply what it is.

Second answer:

Sometimes it is useful to write "matrices of vectors" that operate under the usual rules for matrix multiplication, as you try to do in your second-to-last equation. But this is very non-standard notation and I would avoid it if you're just starting out with linear algebra. As you say, you can represent $T$ by the images $T_J$ and $T_K$, in which case the representation of $T$ in coordinates is as a collection of column vectors: $$T = \left[\begin{array}{c|c} & \\ T_J & T_K \\ & \end{array}\right]$$ with $Tv$ taking linear combinations of the two columns.
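The claim that the columns of the matrix are exactly the images of the basis vectors can be verified directly; here $T$ is an arbitrary example matrix, not one fixed by the question:

```python
import numpy as np

# An arbitrary example of a linear map on R^2, as a matrix.
T = np.array([[2.0, 0.0],
              [1.0, 3.0]])

J = np.array([1.0, 0.0])
K = np.array([0.0, 1.0])

# Applying T to a basis vector picks out the corresponding column.
assert np.array_equal(T @ J, T[:, 0])
assert np.array_equal(T @ K, T[:, 1])
```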