Prove that $A\geq I$ implies that $A$ is invertible.


Here's the question:

Let $A$ be a positive operator on a (possibly infinite dimensional) Hilbert space. Let $I$ denote the identity operator. Suppose that $A \geq I$, which is to say that $A - I$ is a positive operator. Prove that $A$ is invertible.

I think that this is true, but I haven't been able to find a proof one way or the other. I would like to avoid invoking any heavy machinery (like the spectral theorem) if possible. I would also be interested in a proof that carries over to more general $C^*$-algebras.

Of course, the proof in the case of finite dimensional spaces is fairly obvious, since it suffices to show that the operator has a trivial kernel. Since that does not suffice here, I really have no clue what my next move should be.

Any guidance here would be greatly appreciated.


Best answer:

It shouldn't be too hard to show that $A$ is injective: if $Ax = 0$, then since $A - I \ge 0$, $$\langle x,x \rangle \le \langle Ax,x\rangle = 0,$$ so $x = 0$.

It will then follow (by the open mapping theorem, since $A$ is a bounded bijection) that $A$ is invertible once we show that $A$ is surjective, i.e. that the range satisfies $R(A) = H$.

Let $A^*$ denote the adjoint of $A$ (in fact $A^* = A$, since $A$ is positive). Suppose that $y \in N(A^*)$, so that $A^*y = 0$. Then $$0 = \langle y,A^*y \rangle = \langle Ay,y \rangle \ge \langle y,y \rangle,$$ so that $y = 0$ and $N(A^*) = \{0\}$. Since $R(A)^\perp = N(A^*)$, consequently $R(A)^\perp = \{0\}$.

You can use the projection theorem to conclude that $R(A) = H$ provided that $R(A)$ is closed. Suppose that $\{y_k\}$ is a sequence in $R(A)$ that converges to a point $y \in H$ and let $x_k \in H$ satisfy $A x_k = y_k$. Then $$ \langle x_k - x_j, x_k - x_j \rangle \le \langle Ax_k - Ax_j, x_k - x_j \rangle \le \|Ax_k - Ax_j\| \|x_k - x_j\|.$$ Since $\{A x_k\}$ is Cauchy in $H$, it follows that $\{x_k\}$ is Cauchy too, hence $x_k \to x$ for some $x$, which by continuity will satisfy $Ax = y$. Thus $R(A)$ is closed.
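The two displayed inequalities in this closedness argument can be condensed into a single coercivity bound (a standard consequence of $A \ge I$, stated here for completeness):

```latex
\[
\|x\|^2 = \langle x, x \rangle
        \le \langle Ax, x \rangle
        \le \|Ax\|\,\|x\|
\qquad\Longrightarrow\qquad
\|x\| \le \|Ax\| \quad \text{for every } x \in H.
\]
```

This one bound gives injectivity and closedness of the range at once, and applying it to $x = A^{-1}y$ shows $\|A^{-1}y\| \le \|y\|$, i.e. $\|A^{-1}\| \le 1$.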

Another answer:

By hypothesis $B := A - I$ is positive, so its spectrum is contained in $[0,\infty)$. Then the spectrum of $A = B + I$ is contained in $[1,\infty)$; thus $0$ is not in the spectrum of $A$ and $A$ is invertible.

To justify that the spectrum translates, note that $(B+I)-\lambda I = B-(\lambda-1)I$. So $(B+I)-\lambda I$ fails to be invertible precisely when $B-(\lambda -1)I$ fails to be invertible, i.e. when $\lambda-1$ is in the spectrum of $B$. Hence $$ \sigma(B+I)=\{\lambda+1:\ \lambda\in\sigma(B)\}. $$ Since this argument uses only the spectrum, it carries over verbatim to positive elements of any unital $C^*$-algebra.
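The statement is about operators in general, but a finite-dimensional sanity check may still be reassuring. The sketch below (a NumPy illustration with an arbitrary random matrix, not anything taken from the answers above) builds $A = I + B$ with $B$ positive semidefinite, and checks that the eigenvalues of $A$ lie in $[1,\infty)$, that $A$ is invertible, and that $\|A^{-1}\| \le 1$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Build A = I + B, where B = C^T C is positive semidefinite, so A - I >= 0.
C = rng.standard_normal((n, n))
B = C.T @ C
A = np.eye(n) + B

# A is symmetric, so eigvalsh returns its (real) eigenvalues.
eigvals = np.linalg.eigvalsh(A)
print("smallest eigenvalue of A:", eigvals.min())   # should be >= 1

# A is invertible, and ||A^{-1}|| = 1 / lambda_min(A) <= 1.
A_inv = np.linalg.inv(A)
print("A A^{-1} = I:", np.allclose(A @ A_inv, np.eye(n)))
print("||A^{-1}|| <= 1:", np.linalg.norm(A_inv, 2) <= 1 + 1e-12)
```

Of course this verifies nothing in infinite dimensions; it only illustrates the spectral picture $\sigma(A) \subseteq [1,\infty)$ in the matrix case.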