Matrix representation of a linear operator


As I'm studying for my final, my book keeps skipping a lot of steps, and I can't see how they get from point A to point B. This is probably because it's considered elementary at that stage in the book, but it isn't to me.

Let $T$ be the linear operator on $P_2(\mathbb{R})$ defined by $T(f(x)) = f'(x)$. The matrix representation of $T$ with respect to the standard ordered basis $B$ for $P_2(\mathbb{R})$ is

$$[T]_B = \begin{pmatrix} 0 & 1 & 0\\ 0 & 0 & 2\\ 0 & 0 & 0\end{pmatrix}$$

Can someone show me, step by step, how to convert $T(f(x))$ into matrix form? I keep running into this problem and don't know how to do it.

Similarly, how do you do the same when $T$ acts on vectors? For example,

$V = \mathbb{R}^2$ and $T(a, b) = (-2a + 3b, -10a + 9b)$.

How would I convert this $T$ to matrix form?

There are 2 answers below.

BEST ANSWER

The standard basis for $P_2(\mathbb R)$ (which I assume to be the set of polynomials of degree at most $2$) is $(1,x,x^2)$, and write $e_1=1$, $e_2=x$, $e_3=x^2$.

The matrix $M$ of $T$ in the basis $B$ has as entry $m_{i,j}$ the coefficient of $T(e_j)$ along $e_i$. For example, $T(e_1) = T(1) = 0 = 0e_1$, so that $m_{1,1} = 0$. Similarly, $T(e_3) = T(x^2) = 2x = 2e_2$, so that $m_{2,3} = 2$. Computing all these coefficients, we hence have $$M = \begin{pmatrix} 0 & 1 & 0\\ 0 & 0 & 2\\ 0 & 0 & 0\end{pmatrix}.$$ The first column has only zeros, since the derivative of the polynomial $e_1$ is the polynomial $0$. The derivative of $e_2$ is $1 + 0x + 0x^2$, and the derivative of $e_3$ is $0+2x+0x^2 = 0e_1+2e_2+0e_3$.
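The column-by-column construction above can be checked mechanically. Here is a minimal sketch in plain Python (not part of the original answer): a polynomial $c_0 + c_1 x + c_2 x^2$ in $P_2(\mathbb{R})$ is represented by its coordinate vector $[c_0, c_1, c_2]$, and each basis vector is differentiated to produce one column of $M$.

```python
# A polynomial c0 + c1*x + c2*x^2 in P_2(R) is the coefficient list [c0, c1, c2].

def derivative(coeffs):
    """T(f) = f': the coefficient of x^(k-1) in f' is k times the coefficient of x^k in f."""
    return [k * coeffs[k] for k in range(1, len(coeffs))] + [0]

# Standard basis e_1 = 1, e_2 = x, e_3 = x^2 as coordinate vectors.
basis = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

# Column j of [T]_B is the coordinate vector of T(e_j).
columns = [derivative(e) for e in basis]

# Assemble rows: entry (i, j) is the coefficient of T(e_j) along e_i.
M = [[columns[j][i] for j in range(3)] for i in range(3)]
print(M)  # [[0, 1, 0], [0, 0, 2], [0, 0, 0]]
```

This reproduces the matrix $M$ above: the first column is all zeros ($T(e_1) = 0$), the second column records $T(e_2) = 1$, and the third records $T(e_3) = 2x$.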

For the second example, I assume that you want to take as basis the family $\{(1,0),(0,1)\}$. Let $e_1$ be $(1,0)$ and $e_2$ be $(0,1)$. One has to compute now the four coefficients of the matrix: $T(e_1) = T(1,0) = (-2,-10) = -2e_1 + (-10)e_2$, so that $m_{1,1} = -2$ and $m_{2,1} = -10$. Similarly, $T(e_2) = T(0,1) = (3,9) = 3e_1 + 9e_2$, i.e. $m_{1,2} = 3$ and $m_{2,2} = 9$, which yields the matrix $$N = \begin{pmatrix} -2 & 3\\ -10 & 9\end{pmatrix}.$$
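The $\mathbb{R}^2$ example follows the same recipe, and since the basis is the standard one, $T(e_j)$ is already its own coordinate vector. A quick sketch (not from the original answer; the names mirror the text above):

```python
# The operator from the question: T(a, b) = (-2a + 3b, -10a + 9b).
def T(a, b):
    return (-2*a + 3*b, -10*a + 9*b)

e1, e2 = (1, 0), (0, 1)

# T(e_j), written in the standard basis, becomes column j of the matrix.
col1, col2 = T(*e1), T(*e2)
N = [[col1[0], col2[0]],
     [col1[1], col2[1]]]
print(N)  # [[-2, 3], [-10, 9]]
```

As a sanity check, multiplying $N$ by any vector $(a, b)$ gives the same result as applying $T$ directly.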

ANOTHER ANSWER

What does it mean to say a matrix represents a linear transformation?

Suppose $A$ is the matrix of a linear transformation $T$. Then we should have $Av = Tv$ for all $v \in V$.

In the case of a $2\times 2$ matrix for a linear transformation on a vector space of dimension $2$, you would have something like the following.

Applying $A$ to the coordinate vectors of the basis elements $v_1, v_2$ picks out the columns of $A$:

$\begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}1\\0\end{bmatrix}=\begin{bmatrix}a\\c\end{bmatrix}, \qquad \begin{bmatrix}a&b\\c&d\end{bmatrix}\begin{bmatrix}0\\1\end{bmatrix}=\begin{bmatrix}b\\d\end{bmatrix}$

Since $Av = Tv$, these columns must be the coordinate vectors of $T(v_1)$ and $T(v_2)$. You know what $T(v_1)$ and $T(v_2)$ are (given $v_1, v_2$), and you want to find $a, b, c, d$.

For that you need the following procedure :

Take a basis of $V$ (here $V = \mathbb{R}^2$ over $\mathbb{R}$).

The standard basis for $V$ is $\{(1,0), (0,1)\}$.

Now see what $T(1,0)$ and $T(0,1)$ are.

$T(1,0)$ and $T(0,1)$ are elements of $V$, and since $\{(1,0), (0,1)\}$ is a basis, you can write $T(1,0)$ and $T(0,1)$ as unique linear combinations of the basis elements.

Say you have $T(1,0)=(3,4)$ and $T(0,1)=(7,9)$.

Seeing that $T(1,0)=3(1,0)+4(0,1)$, you write the matrix corresponding to $T$ with its first column consisting of the coefficients of the basis elements in the image of the first basis element, by which I mean $T(v_1)$.

So, first column in $A$ is $\begin{bmatrix}3\\4\end{bmatrix}$

Now, the second column consists of the coefficients of the image of the second basis vector, i.e., $T(0,1)=7(1,0)+9(0,1)$.

So, second column of $A$ is $\begin{bmatrix}7\\9\end{bmatrix}$

So, the matrix corresponding to $T$ is $\begin{bmatrix}3&7\\4&9\end{bmatrix}$.
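The whole procedure generalizes to any linear map on $\mathbb{R}^n$ expressed in the standard basis: column $j$ of the matrix is $T(e_j)$. A short sketch (my own illustration, not from the answer; `matrix_of` is a hypothetical helper, and the `T` below is the unique linear map with the values used in the example):

```python
# Build the matrix of a linear map T on R^n in the standard basis:
# column j is T(e_j).
def matrix_of(T, n):
    cols = []
    for j in range(n):
        e_j = [0] * n
        e_j[j] = 1            # standard basis vector e_j
        cols.append(T(e_j))   # its image is column j
    # Transpose the list of columns into a list of rows.
    return [[cols[j][i] for j in range(n)] for i in range(n)]

# The unique linear map with T(1,0) = (3,4) and T(0,1) = (7,9).
def T(v):
    a, b = v
    return [3*a + 7*b, 4*a + 9*b]

print(matrix_of(T, 2))  # [[3, 7], [4, 9]]
```

The output matches the matrix $\begin{bmatrix}3&7\\4&9\end{bmatrix}$ computed by hand above.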