I am attempting to solve the following problem:
"Let $T: V \longrightarrow W$ be a linear transformation from an n-dimensional vector space V to an m-dimensional vector space W. Let $\beta$ and $\gamma$ be ordered bases for V and W, respectively. Prove that $rank(T)$ = $rank(L_A)$ and that $nullity(T)$ = $nullity (L_A)$, where $A = [T]_\beta ^\gamma$."
To be honest, I'm pretty lost with this one. Is the reason for being given $A = [T]_\beta^\gamma$ a hint that we must use the columns of the matrix somehow in the proof? I believe the column vectors of a matrix relate somehow to its rank?
In case a definition for $L_A$ is needed, the following is given in my book: "Let $A$ be an $m \times n$ matrix with entries from a field $F$. We denote by $L_A$ the mapping $L_A\colon F^n \longrightarrow F^m$ defined by $L_A(x) = Ax$ (the matrix product of $A$ and $x$) for each column vector $x \in F^n$. We call $L_A$ a left-multiplication transformation."
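For example (my own toy example, just to check I understand the definition): taking
$$A=\begin{pmatrix}1&2&0\\0&0&1\end{pmatrix}$$
over $F=\mathbb{R}$, the map $L_A\colon\mathbb{R}^3\to\mathbb{R}^2$ sends $x=(x_1,x_2,x_3)^t$ to
$$L_A(x)=Ax=\begin{pmatrix}x_1+2x_2\\x_3\end{pmatrix},$$
so here $\operatorname{rank}(L_A)=2$ and $\operatorname{nullity}(L_A)=1$ (the kernel is spanned by $(2,-1,0)^t$).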
Thanks in advance!
Note that we have a commutative diagram
$$\require{AMScd}
\begin{CD}
V @>T>> W \\
@VI_\beta VV @VVI_\gamma V \\
F^n @>L_A>> F^m
\end{CD}$$
Here, $I_\beta$ is the linear map defined by $$ I_\beta(v)=(\lambda_1,\dotsc,\lambda_n) $$ where $\beta=\{v_1,\dotsc,v_n\}$ and $\lambda_1,\dotsc,\lambda_n$ are the unique scalars satisfying $$ v=\lambda_1 v_1+\dotsb+\lambda_n v_n $$ The map $I_\gamma$ is defined similarly.
Fact 1. $I_\beta$ and $I_\gamma$ are invertible.
Fact 2. $I_\gamma\circ T=L_A\circ I_\beta$
See if you can prove these facts yourself! Once these two facts have been established, your claim is easy to prove.
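(If it helps to connect Fact 2 with notation from your book: since $I_\beta(v)=[v]_\beta$ and $I_\gamma(w)=[w]_\gamma$ are just the coordinate maps, Fact 2 amounts to the identity
$$[T(v)]_\gamma=[T]_\beta^\gamma\,[v]_\beta\qquad\text{for all }v\in V,$$
which is most likely a theorem your book has already proved about the matrix representation.)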
Suppose that $\{k_1,\dotsc,k_\ell\}$ is a basis for $\ker T$, so $\DeclareMathOperator{\nullity}{nullity}\nullity(T)=\ell$.
We claim that $\{I_\beta(k_1),\dotsc,I_\beta(k_\ell)\}$ is a basis for $\ker L_A$. The proof of this involves two steps.

Step 1. Show that $\{I_\beta(k_1),\dotsc,I_\beta(k_\ell)\}$ is linearly independent (use the invertibility of $I_\beta$ from Fact 1).

Step 2. Show that $\{I_\beta(k_1),\dotsc,I_\beta(k_\ell)\}$ spans $\ker L_A$ (use Fact 2 to check that each $I_\beta(k_i)$ lies in $\ker L_A$, and that every element of $\ker L_A$ has the form $I_\beta(k)$ for some $k\in\ker T$).
These two steps combine to prove that $\nullity(L_A)=\ell=\nullity(T)$. That $\DeclareMathOperator{\rank}{rank}\rank(L_A)=\rank(T)$ then follows from the rank-nullity theorem.
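Explicitly, since $T$ and $L_A$ both have $n$-dimensional domains, two applications of the rank-nullity theorem give
$$\operatorname{rank}(L_A)=n-\operatorname{nullity}(L_A)=n-\operatorname{nullity}(T)=\operatorname{rank}(T).$$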