Linear Transformations and Left-multiplication Matrix


I'm reading Friedberg, Insel, and Spence's Linear Algebra (4th edition), and I believe I've found a typo, but I want to make sure there isn't some subtlety I'm not getting.

This is a simple true-or-false question (Exercise 1(g) in Section 2.3) that I believe is true, but the back of the book says it is false.

Let $\mathsf{V},\mathsf{W}$ denote finite-dimensional vector spaces, let $\mathsf{V} \xrightarrow{\mathsf{T}} \mathsf{W}$ be a linear transformation, and let $\mathsf{L}_A$ denote left-multiplication by $A.$

True or False? $\mathsf{T} = \mathsf{L}_A$ for some matrix $A.$

If this is false, then a theorem earlier in the book seems to contradict it by stating that such a matrix exists and is unique.

Before I commit this to memory I just want to be certain.



Best Answer

First, remember that vector spaces are abstract constructs; they need not be "made up" of tuples. Some vector spaces, specifically those of the form $\mathbb{F}^n$ (where $\mathbb{F}$ is a field) are made up of tuples, but not all vector spaces are. You have vector spaces of polynomials (either all of them, or up to a certain degree), of matrices, of functions (continuous, differentiable), of sequences, etc.

Now, if you are working with vector spaces of tuples (in this case, written as columns), say $\mathsf{V}=\mathbb{F}^n$, $\mathsf{W}=\mathbb{F}^m$, then:

  1. Every $m\times n$ matrix $A$ determines a linear transformation $L_A\colon \mathsf{V}\to\mathsf{W}$ by "left multiplication by $A$", namely $$L_A\left(\begin{array}{c}a_1\\\vdots\\a_n\end{array}\right) = A\left(\begin{array}{c}a_1\\\vdots\\a_n\end{array}\right).$$

  2. Given an arbitrary linear transformation $T\colon\mathsf{V}\to\mathsf{W}$, there exists an $m\times n$ matrix $B$ such that $T=L_B$. It can be found by letting $\mathbf{w}_i=T(\mathbf{e}_i)$, where $\mathbf{e}_1,\ldots,\mathbf{e}_n$ are the standard basis vectors of $\mathbb{F}^n$, and letting $B$ be the matrix $B=\Bigl( \mathbf{w}_1|\mathbf{w}_2|\cdots|\mathbf{w}_n\Bigr)$ whose columns are the vectors $\mathbf{w}_i$. Presumably, you have already seen this.
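The construction in point 2 can be sketched numerically (this is my illustration, not the book's; the map $T$ below is an arbitrary choice):

```python
# Sketch: for V = R^2, W = R^3, the matrix whose columns are
# T(e_1), T(e_2) recovers T as left-multiplication.
import numpy as np

def T(v):
    # an arbitrary linear map T: R^2 -> R^3, chosen only for illustration
    x, y = v
    return np.array([x + y, 2 * x, y])

e1, e2 = np.array([1, 0]), np.array([0, 1])
B = np.column_stack([T(e1), T(e2)])   # columns are T(e_i); B is 3x2, i.e. m x n

v = np.array([5, -7])
assert np.array_equal(B @ v, T(v))    # L_B agrees with T on every vector
```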

However, for an $m\times n$ matrix $A$, a linear transformation of the form $L_A$ only makes sense when the domain is $\mathbb{F}^n$ and the codomain is $\mathbb{F}^m$. Otherwise, you can't even "multiply $A$ by the vector".

Thus, for example, if $\mathsf{V}=\mathbf{P}_3(\mathbb{R})$, the vector space of polynomials with real coefficients of degree at most $3$ (plus the zero polynomial), and $\mathsf{W}=\mathbb{R}^2$, and $T\colon\mathsf{V}\to\mathsf{W}$ is given by $T(p(x)) = \left(\begin{array}{c}p(0)\\p'(1)\end{array}\right)$, then there is no matrix $A$ such that $T=L_A$, because you can't multiply a polynomial by a matrix to get a $2$-tuple. That's why the assertion in question is false.
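To make the example concrete, here is a hedged sketch (my code, not the book's) of that $T$: a polynomial in $\mathbf{P}_3(\mathbb{R})$ is held as a coefficient tuple, and $T$ is computed directly. The map is linear, but its inputs are polynomials, not column vectors, so no matrix can be "left-multiplied" against them:

```python
# T(p) = (p(0), p'(1)) on P_3(R), with p stored as coefficients (a0, a1, a2, a3)
# for p(x) = a0 + a1*x + a2*x^2 + a3*x^3.
def T(coeffs):
    a0, a1, a2, a3 = coeffs
    p_at_0 = a0                        # p(0)
    dp_at_1 = a1 + 2 * a2 + 3 * a3     # p'(x) = a1 + 2*a2*x + 3*a3*x^2 at x = 1
    return (p_at_0, dp_at_1)

# p(x) = 2 + 3x - x^2 + 4x^3: p(0) = 2 and p'(1) = 3 - 2 + 12 = 13
assert T((2, 3, -1, 4)) == (2, 13)
```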

That said, what is leading you astray is that given any finite-dimensional vector spaces $\mathsf{V}$ and $\mathsf{W}$ over $\mathbb{F}$, and any linear transformation $T\colon\mathsf{V}\to\mathsf{W}$, there is a way to encode $T$ as a matrix once you fix a basis $\beta$ for $\mathsf{V}$ and a basis $\gamma$ for $\mathsf{W}$: given a vector $\mathbf{v}$ of $\mathsf{V}$, let $[\mathbf{v}]_{\beta}$ be the coordinate vector of $\mathbf{v}$ relative to $\beta$; given $\mathbf{w}\in\mathsf{W}$, let $[\mathbf{w}]_{\gamma}$ be the coordinate vector of $\mathbf{w}$ relative to $\gamma$; and then define $A$ to be the (unique) matrix with the property that $$A[\mathbf{v}]_{\beta} = [T(\mathbf{v})]_{\gamma}$$ for all $\mathbf{v}\in\mathsf{V}$. This matrix is called the coordinate matrix of $T$ relative to $\beta$ and $\gamma$, which Friedberg, Insel, and Spence denote $[T]_{\beta}^{\gamma}$.
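For the polynomial example above, this can be sketched as follows (my illustration; I take $\beta = \{1, x, x^2, x^3\}$ and $\gamma$ the standard basis of $\mathbb{R}^2$):

```python
# Coordinate matrix of T(p) = (p(0), p'(1)) relative to beta = {1, x, x^2, x^3}
# and gamma = standard basis of R^2. Columns are the gamma-coordinates of
# T applied to each basis polynomial:
#   T(1) = (1,0), T(x) = (0,1), T(x^2) = (0,2), T(x^3) = (0,3)
import numpy as np

A = np.array([[1, 0, 0, 0],
              [0, 1, 2, 3]])

# p(x) = 2 + 3x - x^2 + 4x^3, so [p]_beta = (2, 3, -1, 4);
# directly, p(0) = 2 and p'(1) = 3 - 2 + 12 = 13
p_beta = np.array([2, 3, -1, 4])
assert np.array_equal(A @ p_beta, np.array([2, 13]))  # A [p]_beta = [T(p)]_gamma
```

Note that $A$ acts on the coordinate vector $[p]_{\beta}\in\mathbb{R}^4$, never on the polynomial $p$ itself.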

What is also true is that the map $f$ that sends $\mathbf{v}$ to $[\mathbf{v}]_{\beta}$ and the map $g$ that sends $\mathbf{w}$ to $[\mathbf{w}]_{\gamma}$ are isomorphisms between $\mathsf{V}$ and $\mathbb{F}^n$ (where $\dim(\mathsf{V})=n$) and between $\mathsf{W}$ and $\mathbb{F}^m$ (where $\dim(\mathsf{W})=m$); and the matrix $A$ is such that the linear transformation $L_A$ fits into the commutative square: $\require{AMScd}$ $$\begin{CD} \mathsf{V} @>T>> \mathsf{W} \\ @V{f}V{\cong}V @V{\cong}V{g}V \\ \mathbb{F}^n @>L_A>> \mathbb{F}^m \end{CD}$$

But note that $T$ is not equal to $L_A$; it just corresponds to $L_A$, which is a different assertion.

As I noted in the comments, the True/False question is like saying "Given two speakers, they are talking Russian to one another." Your confusion lies in the fact that, whatever it is they are speaking, we can certainly translate what they are saying into Russian. But the fact that we can do that is not the same as asserting that they are speaking Russian. So the claim is false.