Show that $\lbrace S_n x \rbrace$ converges for a particular recursively-defined sequence of operators $S_n$


$H$ is a Hilbert space, $M$ is a self-adjoint bounded linear operator on $H$ with $M \leq I$, and $S_0 = 0$; $S_{n+1} = (1/2)(M + S_n^2)$ for $n = 0, 1, 2, \dots$. For all $n$, both $S_n$ and $S_n - S_{n-1}$ are polynomials in $M$ with nonnegative, real coefficients, $S_n \geq 0$ and $S_n - S_{n-1} \geq 0$, and $||S_n|| \leq 1$. Show that for each $x$ in $H$, the sequence $\lbrace S_n x \rbrace$ converges.

This is a piece of a problem whose goal is to find a "square root" for positive self-adjoint operators, and many of the facts above come from earlier parts of the problem. I'm just not quite sure what approach to take. I know that since $||S_n|| \leq 1$, we have $||S_n x|| \leq ||x||$ for every $x$ in $H$, so the sequence $\lbrace S_n x \rbrace$ is bounded, and hence has a weakly convergent subsequence $\lbrace S_{n_k} x \rbrace$, say with weak limit $y$. I'm having trouble coming up with a good bound on $||y - S_n x||$ at this point to get norm convergence of the full sequence, or else simply bounding $||S_m x - S_n x||$ to show the sequence is Cauchy.

I would just like a hint about which of the several facts about the sequence should be used, and how, to finish proving that the sequence converges.
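As a finite-dimensional sanity check (an illustration only, not part of the proof), one can run the recursion for a small symmetric matrix $M$ with $0 \leq M \leq I$ and watch the iterates settle down. Note that a fixed point $S$ of $S = (1/2)(M + S^2)$ satisfies $(I - S)^2 = I - M$, which is where the "square root" in the larger problem enters. The particular matrix below is my own choice for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# A symmetric matrix M with eigenvalues in [0, 0.95], so 0 <= M <= I.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
M = Q @ np.diag(np.linspace(0.0, 0.95, n)) @ Q.T

# Iterate S_{n+1} = (1/2)(M + S_n^2) starting from S_0 = 0.
S = np.zeros((n, n))
for _ in range(500):
    S = 0.5 * (M + S @ S)

# At the limit, S = (M + S^2)/2 rearranges to (I - S)^2 = I - M.
I = np.eye(n)
print("fixed-point residual:", np.linalg.norm((I - S) @ (I - S) - (I - M)))
print("spectral norm of S:", np.linalg.norm(S, 2))
```

The residual is at machine-precision level, and the spectral norm of the limit stays below $1$, consistent with the facts listed in the problem.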

On BEST ANSWER

We have to show that if $(U_n)_{n\geqslant 1}$ is a sequence of self-adjoint operators such that $\lVert U_n\rVert\leqslant 1$ for each $n$ and $U_n-U_{n-1}$ is positive for each $n\geqslant 2$, then $(U_nx)_{n\geqslant 1}$ is (strongly) convergent for each $x$; the claim in the problem then follows by taking $U_n:=S_n$.

We start from a Cauchy–Schwarz-type inequality: $$|\langle Ax,y\rangle|^2\leqslant \langle Ax,x\rangle\cdot \langle Ay,y\rangle$$ for all $x,y\in H$, where $A$ is a non-negative bounded self-adjoint operator. (It follows from the usual Cauchy–Schwarz argument applied to the semi-inner product $(u,v)\mapsto\langle Au,v\rangle$.) We then use this inequality with $A:=U_{m+n}-U_n$, a fixed $x$, and $y:=U_{m+n}x-U_nx=Ax$. Since $\lVert A\rVert\leqslant 2$, this gives $$\lVert U_{m+n}x-U_nx\rVert^4\leqslant \langle Ax,x\rangle\cdot\lVert A\rVert\,\lVert Ax\rVert^2\leqslant 2\bigl(\langle U_{m+n}x,x\rangle-\langle U_nx,x\rangle\bigr)\lVert Ax\rVert^2.$$ The real sequence $(\langle U_nx,x\rangle)_n$ is nondecreasing and bounded above by $\lVert x\rVert^2$, hence convergent, hence Cauchy; combined with $\lVert Ax\rVert\leqslant 2\lVert x\rVert$, the displayed bound shows that $(U_nx)$ is Cauchy in $H$, and therefore convergent.
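The Cauchy–Schwarz-type inequality above is easy to check numerically in finite dimensions (again just an illustration, not a proof): take a random positive semidefinite matrix $A = B^{\mathsf T}B$ and test the inequality on random real vectors, where $\langle Ax,y\rangle = x^{\mathsf T}Ay$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

# Random positive semidefinite (hence self-adjoint) matrix A = B^T B.
B = rng.standard_normal((n, n))
A = B.T @ B

# Check |<Ax, y>|^2 <= <Ax, x> * <Ay, y> on random vectors.
for _ in range(1000):
    x = rng.standard_normal(n)
    y = rng.standard_normal(n)
    lhs = (x @ A @ y) ** 2           # |<Ax, y>|^2  (real case)
    rhs = (x @ A @ x) * (y @ A @ y)  # <Ax, x> * <Ay, y>
    assert lhs <= rhs * (1 + 1e-12) + 1e-12

print("inequality held on all samples")
```

Equality is approached when $y$ is (up to scaling) close to $x$ modulo the kernel of $A$, mirroring the equality case of ordinary Cauchy–Schwarz.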