Diagonalizability of linear transformations in direct sums under quotienting


I think I have an idea of how to provide a proof for the following, but am unsure in a couple of places and would appreciate some advice:

Q: Let $V$ be a finite-dimensional vector space and let $U$ be a linear subspace of $V$. Let $f$ be an endomorphism of $V$ whose restriction to $U$ is an involution, and let $\pi:V\rightarrow V/U$ be the canonical epimorphism. Further, let $f$ restricted to $U$ be diagonalizable, let $\tilde{f}:V/U\rightarrow V/U$ be the linear map such that $\pi\circ f = \tilde{f}\circ\pi$, and let $\tilde{f}$ be diagonalizable. Prove or give a counterexample: $f$ is diagonalizable.

I think $f$ would have to be diagonalizable for the following reasons:

Let $U'$ be an arbitrary but fixed complement of $U$ in $V$.

  • Due to $\dim(V)<\infty$, we also know that $\dim(U)<\infty$ and $\dim(U')<\infty$.
  • $V=U\oplus U'$
  • $f$ is a diagonalizable involution on $U$
  • $V/U$ decomposes into a direct sum of the eigenspaces of $\tilde{f}$
  • $V/U$ is isomorphic to $U'$
  • Each eigenvector of $\tilde{f}$ for eigenvalue $\lambda_{i}$ has a unique representative $u'_{\lambda_{i}}\in U'$, since $\pi$ restricted to $U'$ is an isomorphism onto $V/U$
  • Since $\pi(u'_{\lambda_{i}})$ lies in a single eigenspace of $\tilde{f}$, we have $\tilde{f}(\pi(u'_{\lambda_{i}}))=\lambda_{i}\pi(u'_{\lambda_{i}})=\pi(\lambda_{i}u'_{\lambda_{i}})=\lambda_{i}u'_{\lambda_{i}}+U$
  • By $\pi\circ f = \tilde{f}\circ\pi$, $\pi(\lambda_{i}u'_{\lambda_{i}})$ must be equal to $\pi(f(u'_{\lambda_{i}}))$.
  • Since $\tilde{f}$ maps $[v]\mapsto f(v)+U$, $\tilde{f}(\pi(u'_{\lambda_{i}}))=\lambda_{i}u'_{\lambda_{i}}+U=f(u'_{\lambda_{i}})+U$
  • Any sum of vectors from the eigenspaces of $\tilde{f}$ corresponds to $\pi(f(v))$ for some $v\in V$; conversely, applying $\pi\circ f$ to any vector from the preimage of an eigenspace of $\tilde{f}$ yields an element of such a direct sum.
  • Thus, since $\lambda_{i}u'_{\lambda_{i}}+U=f(u'_{\lambda_{i}})+U$, we can see that $f(u'_{\lambda_{i}})=\lambda_{i}u'_{\lambda_{i}}$, which means $f$ decomposes into a scalar multiplication on each of the inverse images of the eigenspaces of $\tilde{f}$; these are linear subspaces of $V$ and must span $U'$.
  • Decomposing into scalar multiplications over linear subspaces means being a diagonalizable involution over the sum of those subspaces [this is where I am unsure: is this even correct? Is it correct, but only in this specific case?]
  • Since it was given that $f$ restricted to $U$ is a diagonalizable involution, and, by the last point, $f$ restricted to $U'$ is also a diagonalizable involution, $f$ must be diagonalizable on $V$.

I may very well be overlooking something rather elementary, and at any rate seem to be unable to confirm the correctness and completeness of such a proof (mostly due to the penultimate step).
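One way to probe the penultimate step is to test it on a small concrete map that satisfies all the hypotheses. The following is a hypothetical SymPy sketch (the specific matrix, the choice $U=\operatorname{Span}(e_1,e_2)$, $U'=\operatorname{Span}(e_3)$, and all variable names are my own, not from the question):

```python
import sympy as sp

# Hypothetical endomorphism f of R^3, with U = Span(e1, e2), U' = Span(e3).
f = sp.Matrix([[1,  0, 1],
               [0, -1, 0],
               [0,  0, 1]])

# f restricted to U is the top-left 2x2 block: a diagonalizable involution.
f_U = f[:2, :2]
print(f_U * f_U == sp.eye(2))   # True: involution on U

# The induced map on V/U is the bottom-right 1x1 block, the identity,
# so pi(e3) is an eigenvector of f~ with eigenvalue lambda = 1.
u = sp.Matrix([0, 0, 1])        # representative u' of that eigenvector in U'

# The coset equality holds: f(u') - lambda*u' lies in U ...
print((f * u - u)[2] == 0)      # True: f(u') + U = lambda*u' + U
# ... but the vector equality f(u') = lambda*u' fails in V:
print(f * u == u)               # False

print(f.is_diagonalizable())    # False
```

Here $f(u')+U=\lambda u'+U$ holds, yet $f(u')\neq\lambda u'$ as vectors of $V$: the coset equality in the step above does not force the vector equality.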

Best answer:

$f$ is not always diagonalizable. To see why, write $V = U \oplus U'$ with $U' \simeq V/U$; in a basis adapted to this decomposition, $f$ takes the block form $$f =\begin{pmatrix}f|_U & * \\ 0 & \tilde{f}\end{pmatrix}$$

The bottom-left block is $0$ since $U$ is stable under $f$. Under our hypotheses, $f|_U$ and $\tilde{f}$ are diagonalizable, but we have no information about the top-right block.

To find a concrete counterexample, take $V = \mathbb R^2$, $U = \operatorname{Span}\begin{pmatrix} 1\\0 \end{pmatrix}$ and $f = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$. Then $f|_U$ and $\tilde{f}$ are both equal to the identity (hence diagonalizable), but $f$ is not diagonalizable.
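This counterexample can be checked mechanically, e.g. with SymPy (a small sketch; the variable names are mine):

```python
import sympy as sp

# The counterexample: V = R^2, U = Span(e1), f a Jordan block.
f = sp.Matrix([[1, 1],
               [0, 1]])

# f restricted to U is the top-left 1x1 block: the identity,
# hence a diagonalizable involution.
f_U = f[:1, :1]
print(f_U == sp.eye(1))          # True

# The induced map f~ on V/U is the bottom-right 1x1 block: also the identity.
f_tilde = f[1:, 1:]
print(f_tilde == sp.eye(1))      # True

# Yet f itself is not diagonalizable.
print(f.is_diagonalizable())     # False
```

So both diagonal blocks are as nice as possible, and the failure is caused entirely by the nonzero top-right entry.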