Let $A$ be a Banach algebra with unit $e$, and let $b,c\in A$ be such that $\sigma(b^2-c)\subset \mathbb{R}^+$.
Then there exists $x\in A$ such that $$x^2+bx+xb+c=0.$$
I want to show this, but lack the insight to proceed.
The only thing I have so far is that $b^2-c-\lambda e$ is invertible for every $\lambda < 0$, since such $\lambda$ lie outside $\sigma(b^2-c)$,
so any suggestions are greatly appreciated.
I'm rusty on my functional analysis, but if you complete the square, you see that you want $x$ so that $(x+b)^2 - b^2 + c = 0$, or $(x+b)^2 = b^2 - c$. (Note this needs no commutativity: $(x+b)^2 = x^2 + bx + xb + b^2$, so the cross terms $bx+xb$ are exactly what completing the square produces.) Now, $\sigma(b^2-c)$ is a compact subset of the positive reals, so it lies in some interval $[m,M]$ with $0 < m \le M$. Pick an expansion point $a > M/2$ (say $a = M$); the Taylor (binomial) series $$\sqrt{z} = \sqrt{a}\,\sum_{k\ge 0}\binom{1/2}{k}\Bigl(\frac{z-a}{a}\Bigr)^k$$ then converges on a disk containing $\sigma(b^2-c)$. Uniform convergence on the spectrum alone isn't quite enough to work in $A$, but by the spectral mapping theorem the spectral radius of $(b^2-c-ae)/a$ is $\max_{\lambda\in\sigma(b^2-c)}|\lambda-a|/a < 1$, so the same series with $z$ replaced by $b^2-c$ converges in $A$. Its partial sums are the polynomials in $b^2-c$ that build $\sqrt{b^2-c}$ for you. Then subtract $b$ to get $x$.
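For what it's worth, here is a quick numerical sanity check of this construction (my own sketch, not part of the argument above): take $A$ to be the algebra of $4\times 4$ real matrices, build $\sqrt{b^2-c}$ from the binomial series, and check the residual of the original equation. The helper name `sqrt_via_series`, the expansion point `a`, and the truncation at 200 terms are arbitrary choices for the demo.

```python
# Numerical sketch: matrices as a Banach algebra, square root via the
# binomial series  sqrt(y) = sqrt(a) * sum_k C(1/2, k) ((y - a*I)/a)^k,
# which converges once the spectral radius of (y - a*I)/a is < 1.
import numpy as np
from scipy.special import binom

def sqrt_via_series(y, a, n_terms=200):
    """Binomial-series square root of y, expanded about the scalar a."""
    n = y.shape[0]
    t = (y - a * np.eye(n)) / a          # need spectral radius of t < 1
    term = np.eye(n)                     # t^0
    s = term.copy()                      # C(1/2, 0) = 1
    for k in range(1, n_terms):
        term = term @ t                  # t^k
        s += binom(0.5, k) * term
    return np.sqrt(a) * s

rng = np.random.default_rng(0)
b = rng.standard_normal((4, 4))

# Force sigma(b^2 - c) into the positive reals by choosing c = b^2 - p
# for a diagonalizable p with eigenvalues {0.5, 1, 2, 3}.
d = np.diag([0.5, 1.0, 2.0, 3.0])
v = rng.standard_normal((4, 4))
p = v @ d @ np.linalg.inv(v)
c = b @ b - p

y = b @ b - c                            # equals p by construction
a = 1.1 * max(np.linalg.eigvals(y).real) # any a > M/2 works; be generous
r = sqrt_via_series(y, a)                # r = sqrt(b^2 - c)
x = r - b                                # candidate solution

residual = x @ x + b @ x + x @ b + c
print(np.max(np.abs(residual)))          # tiny (near machine precision)
```

The residual vanishes without $r$ and $b$ commuting, matching the algebra above: $(r-b)^2 + b(r-b) + (r-b)b + c = r^2 - b^2 + c = 0$.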