Let $A$ be a bounded linear operator on a Banach space $X$, and assume that $AK = KA$ for every compact operator $K$ on $X$. How do I show that $A$ must be a scalar multiple of the identity, i.e., that $A = \lambda I$ for some scalar $\lambda$?
So far I have attempted to solve this using Schur's lemma, but I can't see why $A$ should have an eigenvalue in the first place for the lemma to apply (the book I took the problem from doesn't even specify whether the field is real or complex).
Any hint or help is highly appreciated.
I think we can just play around with some special compact operators, unless I'm missing some subtlety. If $x$ and $y$ are linearly independent, then their span is complemented in $X$ (every finite-dimensional subspace of a Banach space is complemented, by Hahn–Banach), so $X \cong \operatorname{span}\{x, y\} \oplus V$ for some closed subspace $V$ of $X$. The projection $P$ along $V$ onto $\operatorname{span}\{x, y\}$ is then bounded. Define a linear map $B: P(X) \rightarrow X$ that swaps $x$ with $y$. Then $K_{x,y} = BP$ is continuous and of finite rank (its image lies in $P(X)$), hence compact. [I've suppressed in the notation the fact that $K_{x,y}$ depends on a choice of complementary subspace $V$.]
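To see the construction concretely, here is a small numerical sketch in $\mathbb{R}^3$, where we can take $V$ to be the orthogonal complement of $\operatorname{span}\{x, y\}$, so that $P$ is the orthogonal projection; the specific vectors $x$ and $y$ are an arbitrary choice of mine:

```python
import numpy as np

# An arbitrary linearly independent pair in R^3.
x = np.array([1.0, 0.0, 1.0])
y = np.array([0.0, 1.0, -1.0])

M = np.column_stack([x, y])             # basis matrix of span{x, y}
coords = np.linalg.inv(M.T @ M) @ M.T   # v -> coordinates (a, b) of Pv = a*x + b*y
S = np.array([[0.0, 1.0],
              [1.0, 0.0]])              # swap coordinates: B(a*x + b*y) = a*y + b*x
K = M @ S @ coords                      # K_{x,y} = B P as a 3x3 matrix

assert np.allclose(K @ x, y)            # K swaps x ...
assert np.allclose(K @ y, x)            # ... with y
assert np.linalg.matrix_rank(K) == 2    # finite rank, hence compact
```

In finite dimensions every operator is of course compact; the point of the sketch is only to check that $K_{x,y}$ really swaps $x$ and $y$ and kills the complement.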
Claim: If there is a nonzero $x$ such that $Ax = \lambda x$, then $A = \lambda I$.
Proof: If $y$ is linearly independent of $x$, then $$Ay = A(K_{x,y}x) = K_{x,y}(Ax) = K_{x,y}(\lambda x) = \lambda y.$$ If instead $y$ is a scalar multiple of $x$, then $Ay = \lambda y$ by linearity. So $A = \lambda I$.
Now suppose, for contradiction, that there is an $x$ such that $x$ and $Ax$ are linearly independent (i.e., $x$ is not an eigenvector of $A$). Then $$x = K_{x, Ax}(Ax) = A(K_{x, Ax}x) = A(Ax) = A^2x,$$ and hence $$A(x + Ax) = Ax + A^2x = Ax + x = x + Ax.$$ Since $x + Ax \neq 0$ by linear independence, $x + Ax$ is an eigenvector of $A$ with eigenvalue $1$. But by the claim, if $A$ has an eigenvector then $A = \lambda I$ (here $\lambda = 1$), which contradicts the linear independence of $x$ and $Ax$. So every nonzero vector is an eigenvector, and, by the claim, $A = \lambda I$ for some $\lambda$.
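In finite dimensions the statement is easy to check numerically: the matrix units $E_{ij} = e_i e_j^{\mathsf T}$ are rank one, hence compact, and a matrix commuting with all of them must already be scalar. A sketch (the function name is my own):

```python
import numpy as np

def commutes_with_all_matrix_units(A, tol=1e-12):
    """Return True iff A commutes with every matrix unit E_ij,
    i.e. the rank-one operators e_i e_j^T on R^n."""
    n = A.shape[0]
    for i in range(n):
        for j in range(n):
            E = np.zeros((n, n))
            E[i, j] = 1.0
            if np.max(np.abs(A @ E - E @ A)) > tol:
                return False
    return True

# A scalar multiple of the identity commutes with everything.
assert commutes_with_all_matrix_units(3.0 * np.eye(4))

# A non-scalar matrix already fails against some rank-one E_ij:
# for A = diag(1, 2, 3), (A E_12)_{12} = 1 while (E_12 A)_{12} = 2.
assert not commutes_with_all_matrix_units(np.diag([1.0, 2.0, 3.0]))
```

This is only a sanity check, of course; the whole content of the problem is the infinite-dimensional case, where compactness is a genuine restriction.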