Given linear map, find nonzero polynomials such that...


Consider the linear map \begin{align*} \theta : P_1 &\to P_1 \\ a+bx &\mapsto (3a + 4b) - (2a + 3b)x \end{align*}

  1. Find a nonzero polynomial $f \in P_1$ such that $\theta(f) = f$.
  2. Find a nonzero polynomial $g \in P_1$ such that $\theta(g) = -g$.
  3. Show that $f, g$ is a basis of $P_1$, and find the matrix representing $\theta$ in the basis $f, g$.

I am unsure how to go about solving 1 and 2 other than plugging in numbers for $a$ and $b$ until the two sides match. Is there a faster way? These problems are on a worksheet for a section of my textbook on matrix similarity and diagonalizability, so I suspect I am missing something that would let me solve 1 and 2 using those ideas.

On BEST ANSWER

There is indeed a connection between this and diagonalisability, but nothing that will help you solve this problem faster than the method laid out by @innerproduct.

Question 1 is asking you for an eigenvector $f$ of $\theta$, corresponding to eigenvalue $1$. That is, a non-zero vector $f$ such that $\theta(f) = 1f$. Question 2 is asking you, similarly, for an eigenvector $g$ of $\theta$, corresponding to eigenvalue $-1$.

If you can find such vectors, then $1$ and $-1$ are indeed eigenvalues of $\theta$. Because eigenvectors corresponding to distinct eigenvalues are automatically linearly independent, $f$ and $g$ would have to be linearly independent. Since $P_1$ has dimension $2$, this would make $B = (f, g)$ a basis for $P_1$, and in fact, a basis of eigenvectors. As such, the $2 \times 2$ matrix of $\theta$ with respect to $B$ must be diagonal, with the corresponding eigenvalues down the diagonal. In particular, $$\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.$$

I reckon this exercise is to illustrate two things:

  1. To see diagonalisation in action, and verify that it works, and
  2. To see that diagonalisation applies not just to linear operators on $\Bbb{R}^n$, but to linear operators on other finite-dimensional vector spaces.

I would suggest finding polynomials $f$ and $g$ exactly as @innerproduct suggests, by turning the problem into two systems of two linear equations in $a$ and $b$, and solving each simultaneously. You'll find that the two equations in each system are just multiples of each other, so you can choose any non-zero value of $b$ (e.g. $b = 1$, or whatever you want) and find a suitable (non-zero) value of $a$. These values are the coefficients of $1$ and $x$ in your polynomial.

Alternatively (and to be clear, I wouldn't recommend this approach when solving this problem for the first time) you could also turn the whole operator into a matrix. If we pick a basis $S = (1, x)$ for $P_1$, we can compute the corresponding matrix like so: \begin{align*} \theta(1) = 3 - 2x &\implies [\theta(1)]_S = \begin{pmatrix} 3 \\ -2 \end{pmatrix} \\ \theta(x) = 4 - 3x &\implies [\theta(x)]_S = \begin{pmatrix} 4 \\ -3 \end{pmatrix}. \end{align*} Therefore, $$[\theta]_S = \begin{pmatrix} 3 & 4 \\ -2 & -3 \end{pmatrix}.$$ The eigenvalues of this matrix are precisely the eigenvalues of $\theta$ (which, if you compute them, should be $\pm 1$). The eigenvectors of this matrix are the coordinate vectors of the eigenvectors of $\theta$. So, if you're more comfortable with matrices, you can do all the same calculations, leaving polynomials behind for the most part.
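If you do take the matrix route, the whole computation is easy to sanity-check numerically. Here's a small sketch using numpy (assumed available): it computes the eigenvalues of $[\theta]_S$ and confirms that each eigenvector column really is fixed up to the corresponding eigenvalue.

```python
# Numerical sanity check of the matrix approach (a sketch, not the
# intended hand computation).
import numpy as np

# [theta]_S in the standard basis S = (1, x), as derived above.
A = np.array([[3.0, 4.0],
              [-2.0, -3.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(sorted(eigvals.round(6)))  # eigenvalues of theta: [-1.0, 1.0]

# Each column of eigvecs is a coordinate vector (a, b), i.e. the
# polynomial a + b*x is an eigenvector of theta.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```

The eigenvector columns numpy returns are normalized, but any nonzero scalar multiple of them gives valid coefficients for $f$ and $g$.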

Anyway, I hope that helps.


Just use the definition of the map $\theta$. Let $f(x)=a+bx$. Then, for part 1, you want

$$\theta(a+bx)=(3a+4b)-(2a+3b)x=a+bx.$$

We can equate the coefficients on each side. This gives the system of equations

$$3a+4b=a$$ $$-(2a+3b)=b,$$

which you can solve for $a, b$ to ultimately get $f$. (Both equations reduce to $a + 2b = 0$, so any nonzero solution of that single equation works.) After you solve the system, use the definition of $\theta$ to verify that it works. Same process for number 2.
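If you want to double-check the two systems without solving them by hand, here is a short sketch using sympy (assumed available); the one-parameter families of solutions it reports match the hand computation.

```python
# Solve the two linear systems from theta(f) = f and theta(g) = -g.
# A verification sketch, not the intended hand computation.
import sympy as sp

a, b = sp.symbols('a b')

# Part 1: theta(f) = f gives 3a + 4b = a and -(2a + 3b) = b.
sol1, = sp.linsolve([3*a + 4*b - a, -(2*a + 3*b) - b], (a, b))
print(sol1)  # (-2*b, b): a = -2b, so with b = 1, f = -2 + x

# Part 2: theta(g) = -g gives 3a + 4b = -a and -(2a + 3b) = -b.
sol2, = sp.linsolve([3*a + 4*b + a, -(2*a + 3*b) + b], (a, b))
print(sol2)  # (-b, b): a = -b, so with b = 1, g = -1 + x
```

As a final check, $\theta(-2 + x) = (-6 + 4) - (-4 + 3)x = -2 + x$ and $\theta(-1 + x) = (-3 + 4) - (-2 + 3)x = 1 - x = -(-1 + x)$, as required.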