Let $x$ be an element of a Banach algebra. Let $\lambda \in \rho(x)$, where $\rho(x)$ is the resolvent set of $x$.
Let $d(\lambda, \sigma(x))$ denote the distance from $\lambda$ to the spectrum $\sigma(x)$.
Show that $\| (\lambda-x)^{-1} \| \geq \frac{1}{d(\lambda, \sigma(x))}$.
Information I know:
$\rho(x)$ is an open set.
$\sigma(x)$ is compact.
I've seen this inequality used in many places, but I cannot find a proof of it, and I'm not sure how to prove it by unpacking the definitions I know.
Thanks in advance!
Let $\mu \in \sigma(x)$, and suppose for contradiction that $|\lambda - \mu| < \frac{1}{\|(\lambda - x)^{-1}\|}$. Then
$$\|(\lambda - x)^{-1}(\mu - \lambda)\| \leq \|(\lambda - x)^{-1}\|\,|\mu - \lambda| < 1,$$
so $e + (\lambda - x)^{-1}(\mu - \lambda)$ is invertible, with inverse given by the Neumann series $\sum_{n=0}^{\infty} \left[-(\lambda - x)^{-1}(\mu - \lambda)\right]^n$. The identity
$$\mu - x = (\lambda - x) + (\mu - \lambda) = (\lambda - x)\left[e + (\lambda - x)^{-1}(\mu - \lambda)\right]$$
then exhibits $\mu - x$ as a product of invertible elements, so $\mu - x$ is invertible. This contradicts $\mu \in \sigma(x)$.

We have proved that $|\lambda - \mu| \geq \frac{1}{\|(\lambda - x)^{-1}\|}$ whenever $\mu \in \sigma(x)$. Taking the infimum over all such $\mu$ gives $d(\lambda, \sigma(x)) \geq \frac{1}{\|(\lambda - x)^{-1}\|}$, which is the desired inequality.
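As a quick sanity check, one can test the inequality numerically in the Banach algebra of $3 \times 3$ complex matrices with the operator norm, where $\sigma(A)$ is just the set of eigenvalues. The matrix `A` and the sample points $\lambda$ below are arbitrary choices, not anything from the question:

```python
import numpy as np

# In the Banach algebra of 3x3 complex matrices (operator norm), verify
# ||(lambda*I - A)^{-1}|| >= 1 / d(lambda, sigma(A)) for lambda in the
# resolvent set. A is an arbitrary upper-triangular matrix, so its
# spectrum is {2, 3, 5} (the diagonal entries).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]], dtype=complex)

eigenvalues = np.linalg.eigvals(A)   # sigma(A)
I = np.eye(3, dtype=complex)

# Arbitrary sample points in the resolvent set (none is an eigenvalue).
for lam in [0.0 + 0.0j, 1.0 + 2.0j, 4.0 - 1.0j]:
    resolvent = np.linalg.inv(lam * I - A)
    norm = np.linalg.norm(resolvent, ord=2)          # operator (spectral) norm
    dist = min(abs(lam - mu) for mu in eigenvalues)  # d(lambda, sigma(A))
    assert norm >= 1.0 / dist                        # the inequality above
    print(f"lambda = {lam}: ||(lambda-A)^-1|| = {norm:.4f} >= 1/d = {1.0/dist:.4f}")
```

For normal matrices the inequality is an equality; the non-normal example above shows the resolvent norm can strictly exceed $1/d(\lambda, \sigma(A))$.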