Operator norm of a self-adjoint operator.


Let $T$ be a self-adjoint operator on $\Bbb C^n$, and let $\lambda_1,\lambda_2,\dots,\lambda_n$ be the eigenvalues of $T$. Show that the operator norm of $T$ satisfies $\|T\| = \max_j |\lambda_j|$.

I am trying to prove it in the following way. I have just proved that $\|T\| = \sup \left\{ |\left< Tx,x \right>| \ :\ \|x\|=1 \right\}$. Also, since $T$ is self-adjoint it is normal, so by the Spectral Theorem for Normal Operators there is an orthonormal basis $\{X_1,X_2,\dots,X_n\}$ of $\Bbb C^n$ consisting of eigenvectors of $T$. Let $\lambda_j$ be the eigenvalue of $T$ corresponding to the eigenvector $X_j$, for $j=1,2,\dots,n$. Then $|\left<TX_j,X_j\right>| = |\left<\lambda_j X_j,X_j\right>| = |\lambda_j|$ (since $\left<X_j,X_j\right>=1$) for $j=1,2,\dots,n$. Since $\|X_j\|=1$ for each $j$, the result proved above gives $\max_j |\lambda_j| \le \|T\|$. But I am having difficulty proving the reverse inequality.
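For reference, the direction already established, $\max_j |\lambda_j| \le \|T\|$, can be checked numerically. This is only an illustration, not part of the proof; the random Hermitian matrix below is an assumption of the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random self-adjoint (Hermitian) matrix T on C^n.
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
T = (A + A.conj().T) / 2  # T equals its conjugate transpose

# eigh returns the (real) eigenvalues of a Hermitian matrix and
# an orthonormal basis of eigenvectors (the columns of X).
eigvals, X = np.linalg.eigh(T)

# Each unit eigenvector X_j witnesses |<T X_j, X_j>| = |lambda_j|,
# so max_j |lambda_j| is a lower bound for the operator norm ||T||
# (computed here as the largest singular value).
op_norm = np.linalg.norm(T, 2)
print(np.max(np.abs(eigvals)) <= op_norm + 1e-12)  # True
```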

For the reverse inequality I started by taking a vector $X \in \Bbb C^n$ with $\|X\|=1$. Then there exist scalars $c_1,c_2,\dots,c_n \in \Bbb C$ such that $X=c_1X_1+c_2X_2+\cdots+c_nX_n$. Since the $X_j$ are orthonormal we have $\|X\|=\sqrt{|c_1|^2+|c_2|^2+\cdots+|c_n|^2}$, so $|c_1|^2+|c_2|^2+\cdots+|c_n|^2=1$. Again by the orthonormality of the $X_j$ we have

$\left <TX,X \right > = |c_1|^2\lambda_1+|c_2|^2\lambda_2+ \cdots +|c_n|^2\lambda_n \le (|c_1|^2+|c_2|^2+ \cdots + |c_n|^2)(\lambda_1+\lambda_2+ \cdots +\lambda_n)=\lambda_1+\lambda_2+\cdots +\lambda_n$.

Hence by the triangle inequality we have

$|\left <TX,X \right >| \le |\lambda_1|+|\lambda_2|+ \cdots +|\lambda_n|$.

This clearly does not serve my purpose. So how do I proceed to prove the reverse inequality? Please help me in this regard.

Thank you very much.


Best answer:

You are basically correct, but the issue is with this inequality you used: $$|c_1|^2\lambda_1+\dots+ |c_n|^2\lambda_n \leq (|c_1|^2+\dots+|c_n|^2)(\lambda_1+\dots+\lambda_n).$$ This is simply not tight enough (indeed, if some of the $\lambda_j$ are negative it need not even hold). Notice that the RHS has many more terms than the LHS, and there is no reason to expect them to be small, since there is no relation between the terms $\lambda_j$ and $|c_k|^2$. What you want to do instead is $$\bigl||c_1|^2\lambda_1+\dots+ |c_n|^2\lambda_n\bigr| \leq |c_1|^2|\lambda_1|+\dots+|c_n|^2|\lambda_n| \leq (|c_1|^2+\dots+|c_n|^2)\max_{i=1,\dots,n}|\lambda_i|=\max_{i=1,\dots,n}|\lambda_i|.$$ This is much more reasonable: it uses only the triangle inequality and the bound $|\lambda_j|\leq \max_{i=1,\dots,n}|\lambda_i|$ for all $j$, and the bound is attained for at least one $j$.
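A quick numerical sanity check of this tighter bound (an illustration under assumed random data, not part of the argument): for any real $\lambda_j$ and any unit vector of coefficients $c_j$, the weighted average is bounded by $\max_j|\lambda_j|$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 7
lam = rng.standard_normal(n)            # arbitrary real eigenvalues, signs mixed
c = rng.standard_normal(n) + 1j * rng.standard_normal(n)
c /= np.linalg.norm(c)                  # unit vector: sum_j |c_j|^2 = 1

# lhs = sum_j |c_j|^2 lambda_j is a convex combination of the lambda_j,
# so its absolute value is at most max_j |lambda_j|.
lhs = np.sum(np.abs(c) ** 2 * lam)
print(abs(lhs) <= np.max(np.abs(lam)))  # True
```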

Another answer:

In your last estimate, writing $\color{red}{|\lambda|_{\max}} = \max_j |\lambda_j|$, do $$ \left|\left <TX,X \right >\right| \le |c_1|^2|\lambda_1|+|c_2|^2|\lambda_2|+ \cdots +|c_n|^2|\lambda_n| \le (|c_1|^2+|c_2|^2+\ldots+|c_n|^2)\color{red}{|\lambda|_{\max}}=\color{red}{|\lambda|_{\max}}. $$
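Putting both directions together, the full identity $\|T\| = \max_j |\lambda_j|$ can be verified numerically as a sanity check (a sketch on an assumed random self-adjoint matrix, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
T = (A + A.conj().T) / 2           # a random self-adjoint matrix

eigvals = np.linalg.eigvalsh(T)    # eigenvalues of a Hermitian matrix (real)
op_norm = np.linalg.norm(T, 2)     # operator (spectral) norm of T

# The two sides of ||T|| = max_j |lambda_j| agree up to rounding error.
print(np.isclose(op_norm, np.max(np.abs(eigvals))))  # True
```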