every rank one operator is a linear combination of rank one idempotents

I'm trying to prove a theorem which makes use of the fact that every finite rank operator on a Banach space is a linear combination of rank one idempotents. It's enough to show this for rank one operators.

Actually I have found a proof in a paper but I can't understand it at all. I wish someone could offer a new proof or help me clarify the following argument:

Given a rank-one operator $u\in B(X)$, there exists $\tau(u)\in\mathbb{C}$ such that $u^2=\tau(u)u$. Moreover, $\tau(u)=0$ or $\tau(u)$ is the only non-zero element in the spectrum of $u$. (What on earth is $\tau$?) Thus if $\tau(u)\neq 0$, then $\tau(u)^{-1}u$ is a minimal idempotent (I think this means a rank-one idempotent), and $u=\tau(u)(\tau(u)^{-1}u)$. Now for $\tau(u)=0$, let $x\in B(X)$ and $\lambda\in\mathbb{C}$ be such that $uxu=u$ and $\lambda\gt r(x)$ ($r(x)$ denotes the spectral radius). Then $e_1=ux$ and $e_2=u(x-\lambda)$ are minimal idempotents satisfying $u=\lambda^{-1}(e_1-e_2)$, which completes the proof.

Best answer:

Let $u(X)=\operatorname{span}(\{v\})$ where $\|v\|=1$. For every $w\in X$ there is a unique $\tau(w)\in\Bbb{C}$ such that $u(w)=\tau(w)v$. If $\alpha\in\Bbb{C}$ and $s\in X$ then $$u(\alpha w+s)=\alpha u(w) + u(s) = \alpha\tau(w)v+\tau(s)v=(\alpha\tau(w)+\tau(s))v.$$ Together with the uniqueness of $\tau(w)$, this proves that $\tau$ is a linear functional. Moreover $$|\tau(w)| = |\tau(w)|\|v\| = \|\tau(w)v\| = \|u(w)\| \le \|u\|\|w\|,$$ proving that $\tau$ is a bounded linear functional.

For any $w$ we have $$u^2(w) = u(u(w)) = u(\tau(w)v) = \tau(w)\tau(v)v = \tau(v) u(w)$$

If $\tau(v)\ne 0$ then let $e(w)=\tau(v)^{-1}u(w)=\tau(v)^{-1}\tau(w)v$. Then $u(w)=\tau(v)e(w)$ and $$e^2(w) = \tau(v)^{-1}u\big(\tau(v)^{-1}u(w)\big)=\tau(v)^{-2} u^2(w)= \tau(v)^{-1}u(w)=e(w),$$ using $u^2=\tau(v)u$ from above. So $e$ is idempotent and has rank $1$, as it has the same image as $u$.
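The $\tau(v)\neq 0$ case can be checked numerically in finite dimensions. Below is a minimal sketch: the concrete vectors $v, f$ are illustrative choices (not from the answer), and the rank-one operator $u(w)=\tau(w)v$ is realized as the matrix $vf^{\mathsf T}$, so that $\tau(w)=f\cdot w$.

```python
import numpy as np

# Illustrative choices: v spans the image, f implements the functional tau.
v = np.array([1.0, 2.0, 0.0])
f = np.array([3.0, 1.0, -1.0])        # tau(v) = f . v = 5, nonzero
u = np.outer(v, f)                     # rank-one operator u(w) = tau(w) v

tau_v = f @ v
assert np.allclose(u @ u, tau_v * u)   # u^2 = tau(v) u

e = u / tau_v                          # e = tau(v)^{-1} u
assert np.allclose(e @ e, e)           # e is idempotent
assert np.linalg.matrix_rank(e) == 1   # same one-dimensional image as u
print("u = tau(v) * e with e a rank-one idempotent")
```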

The case $\tau(v)=0$

Let $K=\ker(u)=\ker(\tau)$. Then $v\in K$ (since $\tau(v)=0$). Pick any vector $w_0\not\in K$; after scaling we may assume $\tau(w_0) = 1$. By Hahn-Banach there is a bounded linear functional $\beta$ such that $\beta(v)=1$. Define $x\in B(X)$ by $x(w)=\beta(w)w_0$.

For $w\in K$ we have $u(w)=0$ and hence $uxu(w)=0$. For $w_0$ we have $$uxu(w_0)=u(x(\tau(w_0)v))=\tau(w_0)\beta(v)\,u(w_0)=u(w_0).$$ Since $X$ is the direct sum of $K$ and $\operatorname{span}(\{w_0\})$, it follows that $uxu = u$.

Now let $\lambda>r(x)$ as in the original post, $e_1=ux$ and $e_2=u(x-\lambda)$. We have $u(x-\lambda)u=uxu-\lambda u^2=uxu=u$, since $u^2=0$ by the assumption that $\tau(v)=0$. $$ \begin{align} e_1^2 &= uxux = ux = e_1\\ e_2^2 &= u(x-\lambda)u(x-\lambda) = u(x-\lambda) = e_2 \end{align} $$ So $e_1, e_2$ are both idempotents. $e_1$ has rank $1$ by design, since $x$ sends $v$ to $w_0$ and $u$ sends it back to $v$. $e_2$ has rank $1$ since $x-\lambda$ is invertible ($\lambda>r(x)$ puts $\lambda$ outside the spectrum of $x$) and $u$ has rank $1$.

Finally, $e_1-e_2 = ux - u(x-\lambda) = \lambda u$, so $u=\lambda^{-1}(e_1-e_2)$.
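The nilpotent case can also be checked in finite dimensions. In this sketch all concrete vectors are illustrative choices: $u=vf^{\mathsf T}$ with $f\cdot v=0$ (so $u^2=0$), $w_0$ plays the role of the vector with $\tau(w_0)=1$, and $\beta$ is a concrete stand-in for the Hahn-Banach functional with $\beta(v)=1$.

```python
import numpy as np

# u = v f^T with f . v = 0, so tau(v) = 0 and u^2 = 0.
v = np.array([1.0, 0.0, 0.0])
f = np.array([0.0, 1.0, 0.0])             # tau = f, tau(v) = 0
u = np.outer(v, f)
assert np.allclose(u @ u, np.zeros((3, 3)))

w0 = np.array([0.0, 1.0, 0.0])            # tau(w0) = 1, w0 not in ker u
beta = np.array([1.0, 0.0, 0.0])          # beta(v) = 1
x = np.outer(w0, beta)                    # x(w) = beta(w) w0
assert np.allclose(u @ x @ u, u)          # u x u = u

lam = 1.0                                 # any lam > r(x); here x^2 = 0, so r(x) = 0
e1 = u @ x
e2 = u @ (x - lam * np.eye(3))
assert np.allclose(e1 @ e1, e1)           # e1 idempotent
assert np.allclose(e2 @ e2, e2)           # e2 idempotent
assert np.allclose(u, (e1 - e2) / lam)    # u = lam^{-1}(e1 - e2)
print("u is a linear combination of two rank-one idempotents")
```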


Here is another argument. Because $u$ is rank-one, there exists $y\in X$ such that $\def\ran{\operatorname{ran}}\ran u=\mathbb Cy$. Fix $x\in X\setminus \ker u$. We have $ux=\beta y$ for some nonzero scalar $\beta$.

  • if $y=\alpha x$ for some scalar $\alpha$, then $ux=\beta y = \alpha\beta x$. An arbitrary element of $X$ is of the form $\gamma x+z$, with $z\in\ker u$. Then $$ u(\gamma x+z)=\gamma ux=\alpha\beta(\gamma x). $$ It follows that $(\alpha\beta)^{-1}u$ is an idempotent.

  • if $x,y$ are linearly independent, write $y=\eta x+ z$ with $z\in\ker u$ (note $z\ne0$, else $x,y$ would be dependent). If we complete $z$ to a basis $\{z\}\cup\{w_j\}$ of $\ker u$, we get that $\{x,y\}\cup\{w_j\}$ is a basis for $X$. As $uw_j=0$ for all $j$, we can think of $u$ as a $2\times 2$ matrix with respect to $x,y$. Since $ux=\beta y$ and $uy=\eta ux=\beta\eta y$, we get that $$ u=\begin{bmatrix}0&0\\ \beta&\beta\eta \end{bmatrix}. $$ If $\eta\ne0$, then $$ u=\beta\eta\begin{bmatrix} 0&0\\ 1/\eta&1\end{bmatrix} $$ is a scalar multiple of an idempotent (the matrix on the right squares to itself). Otherwise, if $\eta=0$, we have using Ryszard's suggestion that $$ u=\begin{bmatrix} 0&0\\ \beta&0\end{bmatrix} =\frac\beta2\,\bigg(\begin{bmatrix} 0&0\\ 1&1\end{bmatrix} +\begin{bmatrix} 1&0\\ 1&0\end{bmatrix} -\begin{bmatrix} 1&0\\ 0&1\end{bmatrix} \bigg) $$ is a linear combination of idempotents.
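The $2\times 2$ identities above are easy to check numerically; $\beta,\eta$ below are arbitrary nonzero values chosen for illustration.

```python
import numpy as np

beta, eta = 2.0, 3.0  # arbitrary nonzero scalars for illustration

# eta != 0: u is beta*eta times an idempotent m
u = np.array([[0.0, 0.0], [beta, beta * eta]])
m = np.array([[0.0, 0.0], [1.0 / eta, 1.0]])
assert np.allclose(u, beta * eta * m)
assert np.allclose(m @ m, m)              # m is idempotent

# eta == 0: u0 = (beta/2)(p1 + p2 - I), a combination of three idempotents
u0 = np.array([[0.0, 0.0], [beta, 0.0]])
p1 = np.array([[0.0, 0.0], [1.0, 1.0]])
p2 = np.array([[1.0, 0.0], [1.0, 0.0]])
i2 = np.eye(2)
for p in (p1, p2, i2):
    assert np.allclose(p @ p, p)          # each term is idempotent
assert np.allclose(u0, (beta / 2) * (p1 + p2 - i2))
print("both 2x2 decompositions verified")
```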