The norm of a derivation


Let $E$ be an infinite-dimensional complex Hilbert space. For $A\in \mathcal{L}(E)$, consider $$W_{0}(A)=\Bigl\{\alpha\in \mathbb{C}:\ \exists\,(z_n)\subset E\ \text{with}\ \|z_n\|=1,\ \lim_{n\rightarrow+\infty}\langle A z_n,z_n\rangle=\alpha,\ \text{and}\ \lim_{n\rightarrow+\infty}\|Az_n\|= \|A\| \Bigr\}.$$ Moreover, we define the map \begin{align*} \delta_A\colon\ &\mathcal{L}(E)\longrightarrow \mathcal{L}(E)\\ & X\longmapsto AX-XA . \end{align*}

I want to understand the proof of the following theorem:

Theorem: Let $A\in \mathcal{L}(E)$. If $\lambda\in W_{0}(A)$, then $$\|\delta_A\|\geq2(\|A\|^2-|\lambda|^2)^{1/2}.$$

Proof: If $\lambda\in W_{0}(A)$, then there exists $(x_n)\subset E\;\;\hbox{such that}\;\|x_n\|=1,\displaystyle\lim_{n\rightarrow+\infty}\langle A x_n,x_n\rangle=\lambda,$ and $\displaystyle\lim_{n\rightarrow+\infty}\|Ax_n\|= \|A\|$.

Let $Ax_n = \alpha_nx_n + \beta_ny_n$, with $y_n$ a unit vector orthogonal to $x_n$; thus $\alpha_n=\langle Ax_n,x_n\rangle\to \lambda$ and $|\alpha_n|^2+|\beta_n|^2=\|Ax_n\|^2\to \|A\|^2$. Let $V_n=x_n\otimes x_n-y_n\otimes y_n$; then $\|V_n\|=1$ and $(AV_n - V_n A)x_n = 2\beta_n y_n$. Thus, \begin{align*} \|\delta_A\| & \geq \|\delta_A(V_n)\| && (\text{since } \|V_n\|=1)\\ & \geq \|(AV_n - V_n A)x_n\|=2|\beta_n| && (\text{since } \|x_n\|=1)\\ &=2(\|Ax_n\|^2-|\langle A x_n,x_n\rangle|^2)^{1/2} \to 2(\|A\|^2-|\lambda|^2)^{1/2}. \end{align*}
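(Here $u\otimes v$ denotes, following the standard convention, the rank-one operator $(u\otimes v)z=\langle z,v\rangle u$; the proof uses this implicitly.) To spell out the final limit: since $\{x_n,y_n\}$ is an orthonormal pair, the Pythagorean identity gives $$|\beta_n|^2=\|Ax_n\|^2-|\alpha_n|^2=\|Ax_n\|^2-|\langle Ax_n,x_n\rangle|^2\longrightarrow \|A\|^2-|\lambda|^2.$$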

I don't understand the following facts:

  • Why can we write $Ax_n = \alpha_nx_n + \beta_ny_n$, with $y_n$ a unit vector orthogonal to $x_n$?

  • Why do $\|V_n\|=1$ and $(AV_n - V_n A)x_n = 2\beta_n y_n$ hold?

Thank you for your help.


Accepted answer:

For your first question: each nonzero vector $x\in E$ gives an orthogonal decomposition $E=\text{span}\{x\}\oplus\text{span}\{x\}^\perp$.
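Concretely (a routine verification, spelled out here for completeness): the component of $Ax_n$ along $x_n$ is $\langle Ax_n,x_n\rangle x_n=\alpha_n x_n$, and the remainder is orthogonal to $x_n$, since $$\langle Ax_n-\alpha_n x_n,\,x_n\rangle=\langle Ax_n,x_n\rangle-\alpha_n\|x_n\|^2=0.$$ So whenever $Ax_n\neq\alpha_n x_n$ one may set $\beta_n=\|Ax_n-\alpha_n x_n\|$ and $y_n=(Ax_n-\alpha_n x_n)/\beta_n$, a unit vector orthogonal to $x_n$. (If $Ax_n=\alpha_n x_n$, take $\beta_n=0$ and any unit vector $y_n\perp x_n$.)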

For your second question, if $z\in E$ we have $$\|V_nz\|^2=|\langle z,x_n\rangle|^2+|\langle z,y_n\rangle|^2\leq\|z\|^2,$$ since $x_n\perp y_n$, so $\|V_n\|\leq1$. Furthermore, since $V_nx_n=x_n$ we have $\|V_n\|=1$.
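To expand the displayed identity (a short verification): since $V_nz=\langle z,x_n\rangle x_n-\langle z,y_n\rangle y_n$ with $x_n\perp y_n$, the Pythagorean identity gives $$\|V_nz\|^2=|\langle z,x_n\rangle|^2+|\langle z,y_n\rangle|^2,$$ and the bound $\leq\|z\|^2$ is Bessel's inequality for the orthonormal pair $\{x_n,y_n\}$.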

Now the last is just a computation: $$(AV_n-V_nA)x_n=Ax_n-V_nAx_n=(\alpha_nx_n+\beta_ny_n)-(\alpha_nx_n-\beta_ny_n)=2\beta_ny_n$$

Another answer:

Let $F$ be a closed subspace of $E$. Then you have $E=F\oplus F^\perp$, i.e. every vector of $E$ has a unique decomposition into a "part in $F$" and a "part orthogonal to $F$". Look specifically at $F=\mathrm{span}(x_n)$, which is a closed subspace. It follows that every vector can be decomposed into a sum of a vector proportional to $x_n$ and a vector orthogonal to $x_n$. Doing this with the vector $Ax_n$ is what happens in the first part of your question.

For the second part, the key is to understand that the operator $x_n\otimes x_n$ is an orthogonal projection (because the norm of $x_n$ is $1$), and $y_n\otimes y_n$ is the projection onto the orthogonal line $\mathrm{span}(y_n)$; thus (for example by choosing an orthonormal basis that contains $x_n$ and $y_n$) you can see that $$\| a\,x_n\otimes x_n+b\, y_n\otimes y_n\|=\max(|a|,|b|).$$ Also $V_n x_n = x_n$, $V_ny_n=-y_n$ and thus: $$(AV_n-V_nA)x_n = (\alpha_n x_n+\beta_n y_n) - V_n(\alpha_n x_n+\beta_n y_n)= 2\beta_n y_n.$$
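To make the norm claim concrete: in an orthonormal basis whose first two vectors are $x_n$ and $y_n$, the operator $a\,x_n\otimes x_n+b\,y_n\otimes y_n$ acts as the block-diagonal operator $$\begin{pmatrix} a & 0\\ 0 & b \end{pmatrix}\oplus 0,$$ i.e. it multiplies the $x_n$-coordinate by $a$, the $y_n$-coordinate by $b$, and annihilates $\{x_n,y_n\}^\perp$; its operator norm is therefore $\max(|a|,|b|)$. Taking $a=1$, $b=-1$ gives $\|V_n\|=1$ directly.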