For a $2 \times 2$ matrix, $\|A^2\|=\|A\|^2$ implies that $A$ is normal


I am reading the book "A Hilbert Space Problem Book" by Halmos and have run into some trouble.

In Problem 205, he claims that

For two-by-two matrices an unpleasant computation proves a strong converse: if $\|A^2\|=\|A\|^2$, then $A$ is normal.

I tried using the determinant in place of the norm of $A$, but that seems to lead nowhere.

I also found a similar question on Math StackExchange. Someone said that it suffices to consider the matrix $\begin{pmatrix} \lambda & 1\\ 0 & \lambda \end{pmatrix}$; I, however, do not understand why.

Could you please give me a hint for proving this claim?

Thank you for your time.


He probably meant the induced $2$-norm. The statement does not always hold for other norms. E.g. suppose the induced $\infty$-norm $\|A\|=\max_i\sum_j|a_{ij}|$ is used. Then $\|A^2\|=\|A\|^2$ for every $n\times n$ row stochastic matrix, because $A^2$ is again row stochastic and both norms equal $1$; but for $n\ge2$ some row stochastic matrices are obviously not normal.
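A quick numerical sanity check of this counterexample (a sketch using NumPy; the matrix below is just one arbitrary non-normal row stochastic example):

```python
import numpy as np

# Induced infinity-norm: maximum absolute row sum.
def inf_norm(A):
    return np.abs(A).sum(axis=1).max()

# A non-normal row stochastic matrix (each row sums to 1).
A = np.array([[0.5, 0.5],
              [1.0, 0.0]])

# A^2 is again row stochastic, so both sides equal 1.
print(inf_norm(A @ A), inf_norm(A) ** 2)   # 1.0 1.0

# Yet A is not normal: A A* != A* A.
print(np.allclose(A @ A.T, A.T @ A))       # False
```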

The statement is correct when the induced $2$-norm $\|A\|=\sqrt{\lambda_{\max}(A^\ast A)}$ is used. The "unpleasant computation" presumably means calculating and comparing $\|A^2\|=\sqrt{\lambda_{\max}\left((A^2)^\ast A^2\right)}$ and $\|A\|^2=\lambda_{\max}(A^\ast A)$ directly, and that really is unpleasant.
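In the induced $2$-norm the claim can at least be checked numerically; as a sketch, here is the matrix $\begin{pmatrix}\lambda & 1\\ 0 & \lambda\end{pmatrix}$ from the question (non-normal, strict inequality) next to an arbitrarily chosen Hermitian matrix (normal, equality):

```python
import numpy as np

# Induced 2-norm (spectral norm): the largest singular value,
# i.e. the square root of the largest eigenvalue of A* A.
def two_norm(A):
    return np.linalg.svd(A, compute_uv=False)[0]

# The non-normal matrix from the question: [[lam, 1], [0, lam]].
lam = 1.0
A = np.array([[lam, 1.0],
              [0.0, lam]])

# For this non-normal A the inequality is strict: ||A^2|| < ||A||^2.
print(two_norm(A @ A) < two_norm(A) ** 2)             # True

# For a normal matrix (e.g. Hermitian) equality holds.
B = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(np.isclose(two_norm(B @ B), two_norm(B) ** 2))  # True
```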

An easier way to prove the statement is to employ the singular value decomposition. Let $A=USV^\ast$ be a singular value decomposition, where the diagonal entries of $S$, the singular values of $A$, are $\sigma_1\ge\sigma_2$. When $\sigma_1=\sigma_2=\sigma$, there is nothing to prove, because $A=\sigma UV^\ast$ is a scalar multiple of a unitary matrix and hence normal. So suppose $\sigma_1>\sigma_2$. By scaling $A$ appropriately, we may also assume that $\|A\|=\sigma_1=1$.

Now, from $\|A^2\|=\|A\|^2=1$ we get $\|USV^\ast USV^\ast\|=1$, and, by unitary invariance of the norm, $\|SV^\ast US\|=1$. Let $x$ be a unit singular vector of $SV^\ast US$ corresponding to the singular value $1$, so that $\|x\|=\|SV^\ast USx\|=1$. The mapping $x\mapsto SV^\ast USx$ is a function of the form $f\circ g\circ f$, where $f:x\mapsto Sx$ strictly shrinks the norm of a unit vector $x$ unless $x$ is a unimodular multiple of $e_1=(1,0)^T$ (because $\sigma_1=1>\sigma_2$), and $g:x\mapsto V^\ast Ux$ preserves norms. It follows that $\|f\circ g\circ f(x)\|=1$ only if both $x$ and $V^\ast Ue_1$ are unimodular multiples of $e_1$. Hence $D:=V^\ast U$ is a diagonal matrix (a $2\times2$ unitary matrix whose first column is a multiple of $e_1$ is diagonal). Since $V^\ast=DU^\ast$, we get $A=USV^\ast=U(SD)U^\ast$, which is unitarily similar to a diagonal matrix and therefore normal.
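The key reduction above, $\|A^2\|=\|SV^\ast US\|$ by unitary invariance, is easy to check numerically (a sketch; the seed and the random matrix are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Induced 2-norm: the largest singular value.
def two_norm(M):
    return np.linalg.svd(M, compute_uv=False)[0]

# Random 2x2 matrix, rescaled so that ||A|| = 1.
A = rng.standard_normal((2, 2))
A /= two_norm(A)

U, s, Vh = np.linalg.svd(A)        # A = U S V*, with S = diag(s)
S = np.diag(s)

# Unitary invariance: ||A^2|| = ||U S V* U S V*|| = ||S V* U S||.
lhs = two_norm(A @ A)
rhs = two_norm(S @ Vh @ U @ S)
print(np.isclose(lhs, rhs))        # True
```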

0
On

I'll give an alternative proof that does not use the singular value decomposition.

We have $$ \Vert A^2\Vert = \max_{x\ne0}\frac{\Vert A^2x\Vert}{\Vert x\Vert} = \max_{\substack{x\ne0\\ Ax\ne0}}\frac{\Vert A^2x\Vert}{\Vert Ax\Vert}\cdot\frac{\Vert Ax\Vert}{\Vert x\Vert} \le \Vert A\Vert^2 $$ (if $A=0$ there is nothing to prove, so assume $A\ne0$; then the maximum is unchanged by discarding the $x$ with $Ax=0$, which contribute $0$). For this to be an equality, there must be an $x$ at which both factors above are maximized simultaneously. Now, $\Vert Ax\Vert/\Vert x\Vert$ is maximized exactly when $x$ is an eigenvector of $A^\ast A$ for its greatest eigenvalue (which is the square of the greatest singular value of $A$).

So let $\sigma$ be the greatest eigenvalue of $A^\ast A$, and let $M$ be the corresponding eigenspace. Then $\Vert A^2\Vert = \Vert A\Vert^2$ implies that for some nonzero $x$, we have $x, Ax \in M$. Cases:

  • $\sigma = 0$: then $A^\ast A=0$, so $A=0$, which is normal.
  • $\dim M=2$: $A^\ast A=\sigma I$, so $A$ is a scalar ($\sqrt\sigma$) times a unitary transformation, hence normal.
  • $\dim M=1$ (the interesting case): since $Ax\in M=\operatorname{span}\{x\}$, $x$ is an eigenvector of $A$ with eigenvalue, say, $\mu$. From $\Vert Ax\Vert^2=\langle A^\ast Ax, x\rangle=\sigma\Vert x\Vert^2$ we get $|\mu|=\sqrt\sigma>0$. Consider a non-zero vector $y$ orthogonal to $x$. Then $$ \langle x, Ay\rangle = \frac1\mu\langle Ax, Ay\rangle = \frac\sigma\mu\langle x, y\rangle = 0. $$ So $Ay\perp x$, i.e. $Ay\in M^\perp=\operatorname{span}\{y\}$, meaning that $y$ is also an eigenvector of $A$. Thus $A$ is normal, since $\{x,y\}$ is an orthogonal basis of eigenvectors.
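The mechanism in the last case can be illustrated numerically for a normal matrix: the top eigenvector $x$ of $A^\ast A$ is then an eigenvector of $A$ itself, so $Ax$ stays in $M$ and both factors in the bound are maximized at once. A sketch (the Hermitian matrix below is an arbitrary example with distinct singular values):

```python
import numpy as np

# Induced 2-norm: the largest singular value.
def two_norm(M):
    return np.linalg.svd(M, compute_uv=False)[0]

# A normal (here Hermitian) 2x2 matrix with distinct singular values.
A = np.array([[2.0, 1.0],
              [1.0, -1.0]])

# x: eigenvector of A*A for its greatest eigenvalue (here dim M = 1).
w, V = np.linalg.eigh(A.T @ A)
x = V[:, -1]                     # eigh sorts eigenvalues ascending

# Ax stays in M = span{x} (Cauchy-Schwarz equality test), so
# ||A^2 x||/||Ax|| = ||A|| as well, giving ||A^2|| = ||A||^2.
Ax = A @ x
print(np.isclose(abs(np.dot(x, Ax)), np.linalg.norm(Ax)))  # True
print(np.isclose(two_norm(A @ A), two_norm(A) ** 2))       # True
```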