Roots in Banach algebras.

I'm studying positive functionals on normed algebras and I got stuck on the following problem:

Let $A$ be a unital Banach algebra, and $x\in A$ be such that $\Vert x\Vert <1$. Then the series $$\sum_{k=0}^\infty\dfrac{1}{k!}\left(\dfrac{1}{2}-0\right)\cdots\left(\dfrac{1}{2}-k+1\right)x^k$$ converges (absolutely) to an element $y\in A$ such that $y^2=1_A+x$.
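As a quick plausibility check (not a proof), one can sum the series numerically in a concrete Banach algebra, say $2\times 2$ matrices with the operator norm. A minimal sketch, assuming NumPy; the test matrix, the truncation order, and the helper name `binom_half` are my own illustrative choices:

```python
import numpy as np

def binom_half(k):
    """Generalized binomial coefficient C(1/2, k) = (1/2)(1/2 - 1)...(1/2 - k + 1) / k!."""
    a = 1.0
    for j in range(k):
        a *= (0.5 - j) / (j + 1)
    return a

# A concrete element with operator norm < 1 (2x2 matrices form a unital Banach algebra).
x = np.array([[0.2, 0.3], [0.1, -0.1]])
assert np.linalg.norm(x, 2) < 1

# Truncated series y = sum_k a_k x^k; 60 terms is ample here since ||x|| < 0.4.
y = sum(binom_half(k) * np.linalg.matrix_power(x, k) for k in range(60))
print(np.allclose(y @ y, np.eye(2) + x))  # True
```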

The series above is just the Taylor series of $\sqrt{1+x}$ centered at $0$. That the series converges absolutely is simple (one can use, for example, the root test). The problem is to show that $y^2=1+x$.

Let $a_k=\dfrac{1}{k!}\left(\dfrac{1}{2}-0\right)\cdots\left(\dfrac{1}{2}-k+1\right)$, i.e. $a_k=\binom{1/2}{k}$. It can be shown by induction on $k$ that $a_k=\dfrac{(-1)^k(2k)!}{(1-2k)(k!)^2 4^k}$ (see Wikipedia).
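For what it's worth, the closed form can be sanity-checked exactly with rational arithmetic (this is a check, not the induction proof; the helper names are mine):

```python
from fractions import Fraction
from math import factorial

def a_product(k):
    """a_k = (1/k!) (1/2 - 0)(1/2 - 1)...(1/2 - (k - 1)), computed exactly."""
    p = Fraction(1)
    for j in range(k):
        p *= Fraction(1, 2) - j
    return p / factorial(k)

def a_closed(k):
    """The closed form a_k = (-1)^k (2k)! / ((1 - 2k) (k!)^2 4^k)."""
    return Fraction((-1) ** k * factorial(2 * k),
                    (1 - 2 * k) * factorial(k) ** 2 * 4 ** k)

print(all(a_product(k) == a_closed(k) for k in range(30)))  # True
```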

Since the series $y=\sum_k a_kx^k$ converges absolutely, the Cauchy product gives $$y^2=\sum_{n=0}^\infty\left(\sum_{k=0}^n a_k a_{n-k}\right)x^n.$$

If we calculate the initial terms, we get $y^2=1+x+0x^2+0x^3+0x^4+0x^5+\cdots$, but I'm having trouble showing that $\sum_{k=0}^na_ka_{n-k}=0$ for every $n\geq 2$. I've tried induction, but it didn't work for me.
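(Aside: since $a_k=\binom{1/2}{k}$, the wanted identity is an instance of the Chu–Vandermonde convolution $\sum_{k=0}^n\binom{1/2}{k}\binom{1/2}{n-k}=\binom{1}{n}$, which vanishes for $n\geq 2$.) A minimal exact check of the Cauchy-product coefficients with rationals, using an illustrative helper `a`:

```python
from fractions import Fraction

def a(k):
    """a_k = C(1/2, k), computed exactly."""
    p = Fraction(1)
    for j in range(k):
        p *= (Fraction(1, 2) - j) / (j + 1)
    return p

cauchy = [sum(a(k) * a(n - k) for k in range(n + 1)) for n in range(15)]
print(cauchy[0] == 1 and cauchy[1] == 1)  # coefficients of 1 and x in y^2
print(all(c == 0 for c in cauchy[2:]))    # True: all higher coefficients vanish
```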

Accepted answer:

Since $\Vert x\Vert<1$, the spectral radius of $x$ is at most $\Vert x\Vert$, so $\sigma(x)\subset r\mathbb{D}$ for some $r<1$. Hence the holomorphic functional calculus at $x$, $$ \gamma_x:\mathcal{O}(r\mathbb{D})\to A,\qquad f\mapsto \gamma_x(f):=f(x), $$ is well defined, and it is a unital algebra homomorphism. Note that $h(z)=\sqrt{1+z}$ (principal branch) belongs to $\mathcal{O}(r\mathbb{D})$, since $\operatorname{Re}(1+z)>0$ there, so we get an element $y=h(x)\in A$; this $y$ coincides with the sum of the series above, because the Taylor series of $h$ converges to $h$ uniformly on $\overline{r\mathbb{D}}$ and $\gamma_x$ is continuous and sends the polynomial partial sums to the partial sums of the series in $A$. Finally $$ y^2=h(x)h(x)=\gamma_x(h)\gamma_x(h)=\gamma_x(h\cdot h)=\gamma_x(1+z)=1_A+x. $$
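To see the functional calculus in action, here is a minimal numerical sketch for the matrix-algebra case, assuming NumPy: it approximates the Riesz–Dunford integral $\gamma_x(h)=\frac{1}{2\pi i}\oint_{|z|=r}h(z)(z-x)^{-1}\,dz$ by a Riemann sum over the circle $|z|=r$, then checks $y^2\approx 1+x$. The test matrix, contour radius, and node count are arbitrary illustrative choices:

```python
import numpy as np

x = np.array([[0.2, 0.3], [0.1, -0.1]])   # eigenvalues of x lie well inside |z| < 0.9
I2 = np.eye(2)
r, m = 0.9, 400                           # contour radius and number of nodes

t = 2 * np.pi * np.arange(m) / m
z = r * np.exp(1j * t)                    # nodes on the circle |z| = r
dz = 1j * z * (2 * np.pi / m)             # z'(t) dt for the Riemann sum

# y = (1/(2 pi i)) * sum over nodes of sqrt(1 + z_j) (z_j - x)^(-1) dz_j
y = sum(np.sqrt(1 + zj) * np.linalg.inv(zj * I2 - x) * dzj
        for zj, dzj in zip(z, dz)) / (2j * np.pi)

print(np.allclose(y @ y, I2 + x))         # True
print(np.allclose(y.imag, 0))             # y is (numerically) a real matrix
```

Since the integrand is analytic and periodic in $t$, the equispaced Riemann sum converges spectrally fast, so a few hundred nodes already reach machine precision here.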