So I'm having trouble with this problem and don't quite know how to approach it. I have the solution, as well as a more detailed solution found online, but neither really explains the whole thing; there are jumps in logic that don't make sense to me.
Anyway, here's the question:
Let
$b_1 = (1, 1, 0)^T$ , $b_2 = (1, 0, 1)^T$ , and $b_3 = (0, 1, 1)^T$
and let $L$ be the linear transformation from $\mathbb R^2$ into $\mathbb R^3$ defined by:
$L(x) = x_1b_1 + x_2b_2 + (x_1 + x_2)b_3$ for $x = (x_1, x_2)^T$.
Find the matrix $A$ representing $L$ with respect to the ordered bases $\{e_1, e_2\}$ and $\{b_1, b_2, b_3\}$.
I know what $e_1$ and $e_2$ represent and that $A$ is going to be a $3 \times 2$ matrix.
$[L(x)]_{\{b_1, b_2, b_3\}} = Ax$, i.e. $Ax$ should give the coordinate vector of $L(x)$ with respect to $\{b_1, b_2, b_3\}$.
However, I'm not sure how to use this to find the correct answer. According to the solution online, I'm supposed to find the coordinate vector first, but the solution doesn't explain how to do this in any way other than plopping the answer in my face.
Can someone please explain how to obtain the solution?
Using the linearity of $L$, we see that
$$L(x_1e_1 + x_2e_2) = x_1L(e_1) + x_2L(e_2),$$
while the definition of $L$ gives
$$L(x_1e_1 + x_2e_2) = x_1b_1 + x_2b_2 + (x_1+x_2)b_3 = x_1(b_1 + b_3) + x_2(b_2 + b_3).$$
Since this holds for every choice of $x_1$ and $x_2$, taking $(x_1, x_2) = (1, 0)$ and then $(0, 1)$ gives
$$L(e_1) = b_1 + b_3, \qquad L(e_2) = b_2 + b_3.$$
Then we just need to put the coordinate vectors of $L(e_1) = b_1 + b_3$ and $L(e_2) = b_2 + b_3$ with respect to $\{b_1, b_2, b_3\}$ into the columns of a matrix:
$$A = \pmatrix{1 & 0 \\ 0 & 1 \\ 1 & 1}$$
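In case the word "coordinates" is the sticking point, here is the first column worked out explicitly. As a vector in $\mathbb R^3$,
$$L(e_1) = b_1 + b_3 = \pmatrix{1 \\ 1 \\ 0} + \pmatrix{0 \\ 1 \\ 1} = \pmatrix{1 \\ 2 \\ 1} = 1\cdot b_1 + 0\cdot b_2 + 1\cdot b_3,$$
so its coordinate vector with respect to $\{b_1, b_2, b_3\}$ is $(1, 0, 1)^T$, the first column of $A$. The same computation with $L(e_2) = b_2 + b_3$ gives the second column $(0, 1, 1)^T$.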
Let's test this to make sure it works, writing $[v]_B$ for the coordinate vector of $v$ with respect to $B = \{b_1, b_2, b_3\}$:
$$Ae_1 = \pmatrix{1 & 0 \\ 0 & 1 \\ 1 & 1}\pmatrix{1 \\ 0} = \pmatrix{1 \\ 0 \\ 1} = [b_1 + b_3]_B = [L(e_1)]_B \\ Ae_2 = \pmatrix{1 & 0 \\ 0 & 1 \\ 1 & 1}\pmatrix{0 \\ 1} = \pmatrix{0 \\ 1 \\ 1} = [b_2 + b_3]_B = [L(e_2)]_B$$
Yep. It works. ;)
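If a numerical sanity check helps, here is a short sketch (assuming Python with NumPy, neither of which appears in the problem): multiplying $Ax$, the $\{b_1, b_2, b_3\}$-coordinates of $L(x)$, by the matrix whose columns are $b_1, b_2, b_3$ should reproduce $L(x)$ in standard coordinates.

```python
import numpy as np

# Columns of B are b1, b2, b3 (in standard coordinates of R^3).
B = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]]).T

# The matrix A found above: its columns are the {b1, b2, b3}-coordinates
# of L(e1) and L(e2).
A = np.array([[1, 0],
              [0, 1],
              [1, 1]])

def L(x):
    """L(x) = x1*b1 + x2*b2 + (x1 + x2)*b3, computed in standard coordinates."""
    x1, x2 = x
    return x1 * B[:, 0] + x2 * B[:, 1] + (x1 + x2) * B[:, 2]

# A @ x gives the coordinates of L(x) with respect to {b1, b2, b3};
# multiplying by B converts those coordinates back to standard ones.
for _ in range(5):
    x = np.random.randn(2)
    assert np.allclose(B @ (A @ x), L(x))
print("A represents L with respect to {e1, e2} and {b1, b2, b3}.")
```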