Consider the claim:
"Let $X$ be an infinite dimensional Banach space and suppose that $T\in\mathcal B(X)$ is compact. Then the spectrum of $T$ is either a finite set or a sequence converging to zero."
In the sources and references available to me, I have only been able to find a proof of this fact for an infinite dimensional Hilbert space $\mathcal H$, not for a general Banach space. How does the proof change when one only has the Banach space structure? And how does one get around the caveats left by weakening the structure from Hilbert to Banach?
It seems to me that the proof in a Hilbert space relies on the use of an orthonormal sequence $\{e_n\}$; is this all that we rely on in the Hilbert space setting? And if it is, how do we make up for its absence in the Banach space scenario?
Here is a sketch of the ingredients of a proof in the Banach space setting. Throughout, $T \in \mathcal{B}(X)$ will be a compact operator on an infinite dimensional Banach space $X$.

**Lemma 1.** Every nonzero $\lambda \in \sigma(T)$ is an eigenvalue of $T$.

**Lemma 2.** For every $\lambda \neq 0$, the eigenspace $\ker(T - \lambda)$ is finite dimensional.

**Lemma 3.** For every $r > 0$, $T$ has only finitely many eigenvalues $\lambda$ with $|\lambda| > r$.

It is easy to see that these three results imply your desired result. The first two are fairly straightforward exercises.
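To spell out how the results combine (a sketch of the bookkeeping, nothing more): every nonzero spectral value is an eigenvalue, and only finitely many eigenvalues lie outside each disk of radius $1/n$, so

```latex
% The nonzero spectrum decomposes into countably many finite pieces:
\sigma(T) \setminus \{0\}
  \;=\; \bigcup_{n=1}^{\infty}
        \bigl\{\, \lambda \in \sigma(T) : |\lambda| \ge \tfrac{1}{n} \,\bigr\},
% each of which is finite. Hence \sigma(T) \setminus \{0\} is countable,
% and any enumeration of it by decreasing modulus is either a finite
% list or a sequence converging to 0.
```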
The idea for the last lemma is to assume, for contradiction, that $(\lambda_j)$ is an infinite sequence of distinct eigenvalues with $|\lambda_j| > r > 0$. Then, with $V_k := \operatorname{span} \bigl( \bigcup_{j = 1}^k \ker(T-\lambda_j) \bigr)$, inductively find a sequence $(x_j)$ such that $\|x_j\| = 1$, $x_j \in V_j$, and $\operatorname{dist}(x_j, V_{j-1}) = 1$. This is possible since $\ker(T-\lambda_j) \not\subset V_{j-1}$ (eigenvectors for distinct eigenvalues are linearly independent) and $\dim V_j < \infty$, so the distance to the finite dimensional subspace $V_{j-1}$ is attained. To conclude, check that $(Tx_j)$ has no Cauchy subsequence, which contradicts the compactness of $T$.
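To see why $Tx_j$ has no Cauchy subsequence, one can make the key estimate explicit (a sketch; the essential observation is that $(T-\lambda_j)V_j \subset V_{j-1}$ and $TV_k \subset V_k$):

```latex
% For j > k, everything except \lambda_j x_j lands in V_{j-1}:
%   w := (T - \lambda_j) x_j - T x_k \in V_{j-1}.
\begin{align*}
\|T x_j - T x_k\|
  &= \bigl\| \lambda_j x_j + w \bigr\| \\
  &= |\lambda_j| \, \bigl\| x_j - (-\lambda_j^{-1} w) \bigr\| \\
  &\ge |\lambda_j| \, \operatorname{dist}(x_j, V_{j-1})
   \;=\; |\lambda_j| \;>\; r,
\end{align*}
```

so the terms $Tx_j$ stay uniformly $r$-separated and no subsequence can converge, contradicting the compactness of $T$.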