As usual, math is kicking me.
I think the title says it all, but unless it gets changed to something simpler and still specific, here's my question: for an ordered basis $E$ of $\Bbb{R}^n$, if $L_1:\ \Bbb{R}^n\ \longrightarrow\ \Bbb{R}^n$ and $L_2:\ \Bbb{R}^n\ \longrightarrow\ \Bbb{R}^n$ have the same matrix representation with respect to $E$, does it follow that $L_1=L_2$?
The answer, apparently, is yes, but it didn't make a whole lot of sense to me. Here's my probably undercooked attempt at proving it:
Assume $L_1$ and $L_2$ have the same matrix representation with respect to $E$; call it $A$. Since $L_1(x) = Ax = L_2(x)$ for every $x$, we can say that $L_1 = L_2$, maybe?
I feel like I'm still missing some more proof, even though I'd believe it if someone told me.
Hey, if anyone checks this out, could someone please recommend a linear algebra book in simple English? I am absolutely not mathematically gifted/driven/inclined.
Many thanks for your time, -Jon
Your idea is absolutely right. In this context it might help your understanding to include the basis $E$ in the notation; this helps distinguish between matrices and the functions they represent. I like the notation ${}_C[T]_B$ for the matrix representing the linear function $T$ with respect to the basis $B$ on the domain and the basis $C$ on the codomain.
The premise of your question is that ${}_E[L_1]_E={}_E[L_2]_E=A$. By definition of the matrix representation, for every $x\in\Bbb{R}^n$ the coordinate vectors satisfy $$[L_1(x)]_E={}_E[L_1]_E[x]_E=A[x]_E={}_E[L_2]_E[x]_E=[L_2(x)]_E.$$ Since the coordinate map $x\mapsto[x]_E$ is a bijection, this gives $L_1(x)=L_2(x)$ for every $x$, which of course means that $L_1=L_2$.
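If a numerical sanity check helps, here is a small sketch (with a made-up basis and matrix, just for illustration) of how a matrix representation $A$ with respect to a basis $E$ determines the map itself: to apply the map to $x$, convert $x$ to $E$-coordinates, multiply by $A$, and convert back. Any two maps with the same $A$ are built from the same recipe, so they agree on every vector.

```python
import numpy as np

# Hypothetical example in R^2: a non-standard ordered basis E,
# stored as the columns of a matrix, and a shared representation A.
E = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # basis vectors (1,0) and (1,1) as columns
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # the common matrix of L1 and L2 w.r.t. E

def L(x):
    """The unique linear map whose matrix w.r.t. E is A."""
    coords = np.linalg.solve(E, x)   # [x]_E, the E-coordinates of x
    return E @ (A @ coords)          # convert [L(x)]_E back to the vector L(x)

x = np.array([5.0, -2.0])
print(L(x))  # → [ 6. -6.]
```

The point is that the right-hand side depends only on $E$ and $A$, so once those are fixed there is only one function the formula can produce.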