Find an ordered basis $\mathcal{B}$ for $M_{2\times 2}(\textbf{R})$ such that $[T]_{\mathcal{B}}$ is a diagonal matrix.


Let $T$ be a linear operator on $M_{n\times n}(\textbf{R})$ defined by $T(A) = A^{t}$.

(a) Show that $\pm 1$ are the only eigenvalues of $T$.

(b) Describe the eigenvectors corresponding to each eigenvalue of $T$.

(c) Find an ordered basis $\mathcal{B}$ for $M_{2\times 2}(\textbf{R})$ such that $[T]_{\mathcal{B}}$ is a diagonal matrix.

MY ATTEMPT

(a) As far as I have understood, an eigenvector of $T$ (an "eigenmatrix") is a nonzero matrix $A$ which satisfies $T(A) = \lambda A$, that is, $A^{t} = \lambda A$.

From this relation, one gets that \begin{align*} \det(A^{t}) = \det(A) = \det(\lambda A) = \lambda^{n}\det(A) \Longleftrightarrow (\lambda^{n} - 1)\det(A) = 0 \end{align*}

Consequently, assuming $\det(A) \neq 0$, we get $\lambda^{n} = 1$, so $\lambda = \pm 1$ if $n = 2k$ and $\lambda = 1$ if $n = 2k+1$.

My question is: why should we consider $\det(A) \neq 0$?

(b) In the case where $\lambda = 1$, we obtain the relation $A^{t} = A$, that is to say, $A$ is symmetric.

In the case where $\lambda = -1$, we obtain the relation $A^{t} = -A$, that is to say, $A$ is skew-symmetric.
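As a quick numerical sanity check of these two cases (a sketch using NumPy; the particular matrices are arbitrary examples of mine):

```python
import numpy as np

# T(A) = A^t acting on 2x2 matrices; check both eigenvalue cases.
S = np.array([[2.0, 3.0],
              [3.0, 5.0]])   # symmetric: T(S) = S, eigenvalue +1
K = np.array([[0.0, -4.0],
              [4.0,  0.0]])  # skew-symmetric: T(K) = -K, eigenvalue -1

assert np.array_equal(S.T, S)    # T(S) = 1 * S
assert np.array_equal(K.T, -K)   # T(K) = -1 * K
```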

(c) I know that $T(A) = A$ if $A$ is symmetric and $T(A) = -A$ if $A$ is skew-symmetric.

However, I do not know how to proceed from here.

Can someone help me to solve it?

BEST ANSWER

For (c): since every matrix is a linear combination of the elementary matrices $$ E^{11}=\left[\begin{array}{cc}1&0\\0&0\end{array}\right], E^{12}=\left[\begin{array}{cc}0&1\\0&0\end{array}\right], E^{21}=\left[\begin{array}{cc}0&0\\1&0\end{array}\right], E^{22}=\left[\begin{array}{cc}0&0\\0&1\end{array}\right], $$ that is, $A=a_{11}E^{11}+a_{12}E^{12}+a_{21}E^{21}+a_{22}E^{22}$, we have $$TA= a_{11}E^{11}+a_{21}E^{12}+a_{12}E^{21}+a_{22}E^{22}.$$ One gets the matrix of a linear transformation by looking at its effect on the basis. Thus for this basis you get $$TE^{11}=E^{11}\ \ ,\ \ TE^{12}=E^{21}\ \ ,\ \ TE^{21}=E^{12}\ \ ,\ \ TE^{22}=E^{22},$$ so the matrix of $T$ is $$[T]=\left[\begin{array}{cccc}1&0&0&0\\0&0&1&0\\0&1&0&0\\0&0&0&1\end{array}\right].$$ The theory says that to find the simplest form of this matrix, which corresponds to a change of basis for $M_{2\times 2}(\mathbb R)$, one seeks its eigenvectors, which translate in matrix form to $$ B^1=\left[\begin{array}{cc}0&-1\\1&0\end{array}\right], B^2=\left[\begin{array}{cc}1&0\\0&0\end{array}\right], B^3=\left[\begin{array}{cc}0&1\\1&0\end{array}\right], B^4=\left[\begin{array}{cc}0&0\\0&1\end{array}\right]. $$ It is easy to check that $$TB^1=-B^1\ \ ,\ \ TB^2=B^2\ \ ,\ \ TB^3=B^3\ \ ,\ \ TB^4=B^4.$$ Hence the matrix in this new basis is $$\left[\begin{array}{cccc}-1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\end{array}\right].$$
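This computation can be verified numerically. The following sketch (assuming NumPy, flattening each matrix row by row so its coordinates match the ordering $E^{11}, E^{12}, E^{21}, E^{22}$; the helper name `vec` is mine) builds $[T]$ column by column and checks that the change of basis diagonalizes it:

```python
import numpy as np

def vec(A):
    """Coordinates of A in the elementary basis E11, E12, E21, E22
    (row-major flattening makes these coincide with A's entries)."""
    return A.reshape(4)

E = [np.array(m, dtype=float) for m in (
    [[1, 0], [0, 0]], [[0, 1], [0, 0]],
    [[0, 0], [1, 0]], [[0, 0], [0, 1]])]

# Column j of [T] holds the coordinates of T(E_j) = E_j^t.
T = np.column_stack([vec(Ej.T) for Ej in E])
expected = np.array([[1, 0, 0, 0],
                     [0, 0, 1, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1]], dtype=float)
assert np.array_equal(T, expected)

# Eigenbasis: one skew-symmetric and three symmetric matrices.
B = [np.array(m, dtype=float) for m in (
    [[0, -1], [1, 0]], [[1, 0], [0, 0]],
    [[0, 1], [1, 0]], [[0, 0], [0, 1]])]
P = np.column_stack([vec(Bj) for Bj in B])

# Change of basis: P^{-1} [T] P should be diagonal with entries -1, 1, 1, 1.
D = np.linalg.inv(P) @ T @ P
assert np.allclose(D, np.diag([-1.0, 1.0, 1.0, 1.0]))
```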


To answer your questions briefly:

(a) You shouldn’t, in principle, assume the determinant is nonzero. The problem is that your proof breaks down when $\det(A) = 0$, so you need another argument. How about this: for an “eigenmatrix” $A$, $T^{2}(A) = (A^{t})^{t} = A$, while also $T^{2}(A) = \lambda^{2}A$. Hence $\lambda^{2} = 1$ (for example, compare any nonzero entry of $A$ on both sides, if thinking about numbers is easier than matrices), so $\lambda = \pm 1$.
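A minimal numerical illustration of the $T^{2} = \mathrm{id}$ observation underlying this argument (a NumPy sketch; the random matrix and seed are arbitrary):

```python
import numpy as np

# Transposing twice gives the matrix back, so T^2 is the identity
# and every eigenvalue of T must satisfy lambda^2 = 1.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

assert np.array_equal(A.T.T, A)  # T^2(A) = A
```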

(c) I will give a quick answer that is more general (i.e., valid in dimension $n$, not just $2$). Pick a basis $\{e_i\}$ of your space and its corresponding dual basis $\{\theta_j\}$, i.e. $\theta_j(e_i) = \delta_{ji}$. Define $E_{ij}:=e_i\otimes\theta_j$, i.e. the operator defined by $E_{ij}(\sum_kv_ke_k) = v_je_i$, or if you wish, the matrix with a $1$ in position $(i,j)$ and zeroes everywhere else (in the basis $\{e_i\}$, of course). These operators clearly form a basis of the space of operators.

Now, the basis you're looking for needs to consist of symmetric or antisymmetric operators, so you can build it from operators of the form $E_{ij} + E_{ji}$ (for $i \leq j$) and $E_{ij} - E_{ji}$ (for $i < j$). A moment's thought can convince you that this works: these $\tfrac{n(n+1)}{2}$ symmetric and $\tfrac{n(n-1)}{2}$ antisymmetric operators are linearly independent and number $n^{2}$ in total.
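The construction above can be sketched in code for general $n$ (a NumPy sketch; the names `E` and `eigenbasis` are mine, and I use $E_{ii}$ directly for the diagonal terms rather than the scalar multiple $E_{ii}+E_{ii}$):

```python
import numpy as np
from itertools import combinations

def eigenbasis(n):
    """Return a basis of n x n matrices on which transpose acts diagonally:
    symmetric matrices (eigenvalue +1) and skew-symmetric ones (eigenvalue -1)."""
    def E(i, j):
        M = np.zeros((n, n))
        M[i, j] = 1.0
        return M
    sym = [E(i, i) for i in range(n)] + \
          [E(i, j) + E(j, i) for i, j in combinations(range(n), 2)]
    skew = [E(i, j) - E(j, i) for i, j in combinations(range(n), 2)]
    return sym, skew

sym, skew = eigenbasis(3)
assert len(sym) + len(skew) == 9                    # n^2 elements in total
assert all(np.array_equal(M.T, M) for M in sym)     # eigenvalue +1
assert all(np.array_equal(M.T, -M) for M in skew)   # eigenvalue -1
```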