I have a linear transformation: $$T_1:\Bbb R^2 \rightarrow \Bbb R^2, \quad T_1(a_1,a_2) = (a_1+a_2, 2a_1+4a_2)$$ The following ordered bases of $\Bbb R^2$ are given: $\beta = \{(1,2),(0,1)\}$, and $\gamma$ is the standard basis of $\Bbb R^2$.
I need to compute
$[T_1]_\beta^\gamma$
I'm doing this:
$T_1(1,2) = (3,6)$ // $\beta$, row 1
$T_1(0,1) = (1,4)$ // $\beta$, row 2
So $[T_1]_\beta=\{(3,6),(1,4)\}$ ... right?
... Now what? How do I "plug" that into $\gamma$ (or plug $\gamma$ into that?)
I keep thinking I understand it, but then I do all the computations and the theorems don't hold. Ultimately, this problem has several parts: I need to compute a $[T_2]_\alpha^\beta$, then show that $[T_1T_2]_\alpha^\gamma = [T_1]_\beta^\gamma[T_2]_\alpha^\beta$. My book shows one example of a transformation with respect to a single basis, but nothing about what happens when the bases change -- it has sums and proofs and whatnot, but I'm finding those more confusing than anything. I can't find any step-by-step example that shows what exactly it's saying.
That's the way I would do it -- someone please correct me if I get anything wrong; I'm still studying linear algebra.
$[T_1]_{b}^{\gamma}$ is the matrix of the linear transformation that takes a vector expressed in the standard basis $\gamma$ of $\Bbb R^2$, applies the transformation, and returns the result in coordinates of the basis $b$. (A caution on notation: I'm reading the superscript as the input basis here; some books read the subscript as the basis of the domain instead, so check which convention your book uses.)
Starting from that, we can write: $$ [T_1]_{b}^{\gamma} = [I]_{b}^{\gamma}\cdot [T_1]_{\gamma}^{\gamma} $$ where $[I]_{b}^{\gamma}$ is the change-of-basis matrix from $\gamma$ to $b$.
Now, all we need to do is find the matrix $[I]_{b}^{\gamma}$ and multiply it by the matrix of the transformation, $[T_1]_{\gamma}^{\gamma}$.
The columns of the matrix $[T_1]_{\gamma}^{\gamma}$ are the images of the vectors of the basis $\gamma$: $$ (1,0) \rightarrow (1+0,\,2\cdot 1 + 4\cdot 0) = (1,2)\\ (0,1) \rightarrow (0+1,\,2\cdot 0 + 4\cdot 1) = (1,4)\\ \left[\begin{matrix} 1 & 1 \\ 2 & 4 \end{matrix}\right] $$ Now for the change-of-basis matrix from $\gamma$ to $b$, we decompose the $\gamma$ vectors in terms of the $b$ vectors: $$ (1,0) = a(1,2)+b(0,1) \rightarrow a = 1 ,\, b = -2\\ (0,1) = c(1,2)+d(0,1) \rightarrow c = 0 ,\, d = 1\\ \left[\begin{matrix} 1 & 0 \\ -2 & 1 \end{matrix}\right] $$
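Both matrices are easy to double-check numerically. A minimal sketch using numpy (the variable names are mine): the transformation matrix's columns are $T_1(e_1)$ and $T_1(e_2)$, and decomposing each $e_i$ in terms of the $b$ vectors amounts to inverting the matrix whose columns are the $b$ vectors.

```python
import numpy as np

def T1(v):
    # T1(a1, a2) = (a1 + a2, 2*a1 + 4*a2)
    a1, a2 = v
    return np.array([a1 + a2, 2 * a1 + 4 * a2])

# Matrix of T1 in the standard basis: its columns are T1(e1) and T1(e2).
T1_gamma = np.column_stack([T1([1, 0]), T1([0, 1])])

# Change of basis gamma -> b: solving B x = e_i for each standard basis
# vector e_i, where the columns of B are the b vectors, is the same as
# computing B^{-1}.
B = np.column_stack([[1, 2], [0, 1]])
I_b_gamma = np.linalg.inv(B)

print(T1_gamma)    # columns are T1(e1) and T1(e2)
print(I_b_gamma)   # coefficients of e1, e2 in terms of b
```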
Now multiplying the matrices, we get:
$$ \left[\begin{matrix} 1 & 0 \\ -2 & 1 \end{matrix}\right] \cdot \left[\begin{matrix} 1 & 1 \\ 2 & 4 \end{matrix}\right] = \left[\begin{matrix} 1 & 1 \\ 0 & 2 \end{matrix}\right] $$
Now, to check that everything is correct, let's pick an arbitrary vector of $\Bbb R^2$ and describe it in terms of $\gamma$: $$ V_{\gamma} = (a,b) $$ Then we apply the transformation $[T_1]_{b}^{\gamma}$: $$ \left[\begin{matrix} 1 & 1 \\ 0 & 2 \end{matrix}\right]\cdot \left[\begin{matrix} a \\ b \end{matrix}\right] = \left[\begin{matrix} a+b \\ 2b \end{matrix}\right] $$ Now we just need to check that applying $[T_1]_{\gamma}^{\gamma}$ first and then $[I]_{b}^{\gamma}$ gives the same result: $$ \left[\begin{matrix} 1 & 1 \\ 2 & 4 \end{matrix}\right] \cdot \left[\begin{matrix} a \\ b \end{matrix}\right] = \left[\begin{matrix} a+b \\ 2a + 4b \end{matrix}\right]\\ \left[\begin{matrix} 1 & 0 \\ -2 & 1 \end{matrix}\right] \cdot \left[\begin{matrix} a+b \\ 2a + 4b \end{matrix}\right] = \left[\begin{matrix} a+b \\ 2b \end{matrix}\right] $$
So it's indeed correct.
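The same consistency check can be run numerically on random vectors: applying the composed matrix in one step must agree with transforming first and changing basis afterwards. A sketch (again numpy; the matrices are rebuilt from the transformation rather than typed in by hand):

```python
import numpy as np

def T1(v):
    # T1(a1, a2) = (a1 + a2, 2*a1 + 4*a2)
    a1, a2 = v
    return np.array([a1 + a2, 2 * a1 + 4 * a2])

# T1 in the standard basis, and the change of basis gamma -> b
# for b = {(1, 2), (0, 1)}.
T1_gamma = np.column_stack([T1([1, 0]), T1([0, 1])])
I_b_gamma = np.linalg.inv(np.column_stack([[1, 2], [0, 1]]))
T1_b_gamma = I_b_gamma @ T1_gamma  # the composed matrix

rng = np.random.default_rng(0)
for _ in range(5):
    v = rng.standard_normal(2)  # arbitrary vector in gamma-coordinates
    # Composed matrix in one step vs. transform first, change basis after:
    assert np.allclose(T1_b_gamma @ v, I_b_gamma @ (T1_gamma @ v))
```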
I really don't know if I was clear enough, or if it's 100% correct. I hope to have contributed a little to your learning process. Feel free to leave a comment if you have any doubts about what I've done...
Let's share some knowledge :) Thanks!