Let $A$ be a $C^*\!$-algebra. Suppose $x$ is a normal element of $A$ and $\operatorname{spect}(x)$ lies in $\mathbb{R}$. Prove that $x$ is self-adjoint.
I tried the following: using $\operatorname{spect}(x)=\overline{\operatorname{spect}(x^*)}$, conclude that $\lambda-x=\lambda-x^* \implies x=x^*$ for $\lambda\in\operatorname{spect}(x)$. Is this valid?
There may well be a simpler solution. (But see a certain comment below.)
Since $x$ is normal, the closed subalgebra of $A$ generated by $x$, $x^*$ and $\mathbb 1$ is a commutative $C^*$-algebra; replacing $A$ by this subalgebra, we may assume $A$ is commutative. Since a commutative $C^*$-algebra is semi-simple, it's enough to show that $\phi(x^*)=\phi(x)$ for every complex homomorphism $\phi$ of $A$. Since $\phi(x)\in\operatorname{spect}(x)\subset\mathbb R$, so that $\phi(x)$ is real, it's enough to show $$\phi(x^*)=\overline{\phi(x)}.$$I was surprised to find, a few years ago when I was working this out, that that last bit cannot be proved by purely algebraic fiddling; it depends on completeness! There are simple examples of things satisfying all of the definition of a "$C^*$-algebra" except completeness of the norm, where $\phi(x^*)\ne\overline{\phi(x)}$ in general.
EDIT: The example is described at the bottom of the post, due to popular demand.
Now to show that $\phi(x^*)=\overline{\phi(x)}$ it's enough to show that if $y^*=y$ then $\phi(y)\in\mathbb R$. Given this, then in general $\phi(x+x^*)$ and $\phi((x-x^*)/i)$ are both real, and since $x$ and $x^*$ are linear combinations of these two self-adjoint elements, it follows that $\phi(x^*)=\overline{\phi(x)}$.
So we need this:
Lemma. If $x=x^*$ then $\phi(x)\in \mathbb R$.
The standard proof: For $y$ in any Banach algebra with identity define $$\exp(y)=\sum_{n=0}^\infty\frac{y^n}{n!},$$where $y^0=\mathbb 1$, the identity. If $y_1y_2=y_2y_1$ then it's easy to see that $$\exp(y_1+y_2)=\exp(y_1)\exp(y_2).$$Continuity of the involution and of the complex homomorphism $\phi$ show that $$\exp(x^*)=(\exp(x))^*,\quad\phi(\exp(x))=\exp(\phi(x)).$$
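For intuition, the commuting-exponentials identity can be sampled numerically in the Banach algebra of $2\times2$ matrices (a sketch of mine using SciPy's `expm`; the particular matrices are throwaway examples, not from the post):

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential, sum of y^n / n!

# Two commuting elements: both are polynomials in the same matrix M.
M = np.array([[0.0, 1.0], [2.0, 3.0]])
y1 = M
y2 = M @ M  # commutes with M

# exp(y1 + y2) = exp(y1) exp(y2) holds when y1 y2 = y2 y1:
print(np.allclose(expm(y1 + y2), expm(y1) @ expm(y2)))  # True

# Without commutativity the identity generally fails:
a = np.array([[0.0, 1.0], [0.0, 0.0]])
b = np.array([[0.0, 0.0], [1.0, 0.0]])
print(np.allclose(expm(a + b), expm(a) @ expm(b)))  # False
```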
Suppose now that $x^*=x$ and define $z:\mathbb R\to A$ by $$z(t)=\exp(itx).$$ Since $x^*=x$ we have $$z(t)^*z(t)=\exp(-itx^*)\exp(itx)=\exp(it(x-x^*))=\exp(0)=\mathbb 1.$$That is, each $z(t)$ is unitary. Now $||z(t)||^2=||z(t)^*z(t)||=1$, so $||z(t)||=||z(t)^*||=1$, so $|\phi(z(t))|\le1$ and $|\phi(z(t)^*)|\le1$. On the other hand, $\phi(z(t)^*)\phi(z(t))=1$. So $|\phi(z(t))|=|\phi(z(t)^*)|=1$.
Hence for every real $t$ we have $$\left|\exp(it\phi(x))\right|=|\phi(z(t))|\le1.$$Writing $\phi(x)=a+bi$ with $a,b\in\mathbb R$, this says $e^{-tb}\le1$ for every real $t$, which forces $b=0$; hence $\phi(x)\in\mathbb R$.
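The unitarity step can be checked numerically in the matrix $C^*$-algebra $M_2(\mathbb C)$ (an illustration of mine of the norm computation only; $M_2$ has no complex homomorphisms, so $\phi$ itself is not modeled here):

```python
import numpy as np
from scipy.linalg import expm

# A self-adjoint element X = X* of M_2(C), standing in for x.
X = np.array([[1.0, 2.0 - 1.0j], [2.0 + 1.0j, -3.0]])
assert np.allclose(X, X.conj().T)

for t in (0.5, 1.0, 7.3):
    z = expm(1j * t * X)
    # z(t)* z(t) = 1: each z(t) is unitary...
    print(np.allclose(z.conj().T @ z, np.eye(2)))        # True
    # ...hence has operator (spectral) norm exactly 1.
    print(np.isclose(np.linalg.norm(z, 2), 1.0))         # True
```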
For that example. Let $A$ be the space of trigonometric polynomials $$f(t)=\sum_{n=-N}^Nc_ne^{int},$$with norm $$||f||=\sup_{t\in\mathbb R}|f(t)|$$and involution $$f^*=\overline f.$$(So the completion of $A$ is just $C(\mathbb T)$ with the usual norm and involution.) Define $$\phi(f)=\sum_{n=-N}^Nc_n2^n.$$It's clear that $\phi(f^*)\ne\overline{\phi(f)}$ in general: for $f(t)=e^{it}$ we get $\phi(f)=2$ but $\phi(f^*)=1/2$. If it's not clear that $\phi$ is multiplicative, note that $A$ can be viewed as the space of finite Laurent series (i.e. rational functions with no poles except possibly at $0$ and $\infty$) and if we look at $A$ that way then $\phi(f)=f(2)$. (If we look at $A$ as a set of rational functions then the description of the involution takes more space, hence the official definition as the space of trigonometric polynomials.)
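Both properties of $\phi$ can be verified numerically. In this sketch of mine, a trigonometric polynomial is stored as a dictionary $\{n: c_n\}$, and `mul`, `star`, `phi` are my names for the product (coefficient convolution), the involution, and the homomorphism:

```python
import numpy as np

def mul(f, g):
    """Product of trig polynomials: coefficients convolve."""
    h = {}
    for m, a in f.items():
        for n, b in g.items():
            h[m + n] = h.get(m + n, 0) + a * b
    return h

def star(f):
    """Involution f* = conj(f): sends c_n to conj(c_{-n})."""
    return {-n: np.conj(c) for n, c in f.items()}

def phi(f):
    """phi(f) = sum c_n 2^n, i.e. f evaluated at 2 as a Laurent series."""
    return sum(c * 2.0**n for n, c in f.items())

f = {-1: 1.0 + 2.0j, 0: 3.0, 2: -1.0j}
g = {1: 2.0, 3: 1.0 - 1.0j}

# phi is multiplicative...
print(np.isclose(phi(mul(f, g)), phi(f) * phi(g)))  # True
# ...but does not respect the involution:
print(np.isclose(phi(star(f)), np.conj(phi(f))))    # False
```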
Note that $\phi$ is not continuous; the automatic continuity of complex homomorphisms also depends on completeness.
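The discontinuity is easy to see concretely: the trigonometric polynomials $f_n(t)=e^{int}$ all have sup-norm $1$, yet $\phi(f_n)=2^n$ is unbounded. A one-liner check (same coefficient-dictionary encoding as above, my own sketch):

```python
# f_n(t) = e^{int} is the dictionary {n: 1.0}; ||f_n|| = 1 for every n,
# but phi(f_n) = 2^n grows without bound, so phi cannot be continuous.
def phi(f):
    """phi(f) = sum c_n 2^n for f = {n: c_n}."""
    return sum(c * 2.0**n for n, c in f.items())

values = [phi({n: 1.0}) for n in range(1, 11)]
print(values)  # [2.0, 4.0, ..., 1024.0]
```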