Let $(X,||\cdot||)$ be a Banach space over $\mathbb{C}$ and $\mathcal{L}(X)$ the set of continuous linear maps from $X$ to itself. For $T\in\mathcal{L}(X)$, define the operator norm $||T||_{\mathcal{L}(X)}:=\sup_{v\neq 0}\frac{||T(v)||}{||v||}$ and the spectrum $\sigma(T):=\{\lambda\in\mathbb{C}\mid T-\lambda I\,\text{ has no inverse}\}$. Define the sets: $$GL(X):=\{T\in\mathcal{L}(X)\mid T\text{ is invertible}\}$$ $$\mathcal{H}(X):=\{T\in GL(X)\mid T\text{ is hyperbolic}\}$$ (where hyperbolic means $|\lambda|\neq 1$ for all $\lambda\in\sigma(T)$)
Prove that: (1) $GL(X)\subset\mathcal{L}(X)$ is open and (2) $\mathcal{H}(X)$ is open and dense in $GL(X)$.
(1) Take $A\in GL(X)$ and the open ball $B_r(A)$, where $r:=1/||A^{-1}||$. If $B\in B_r(A)$, then $||I-BA^{-1}||=||(A-B)A^{-1}||\leq||A-B||\cdot||A^{-1}||<r||A^{-1}||=1$. The series $\sum_{n=0}^\infty(I-BA^{-1})^n$ is therefore absolutely convergent (it is dominated by the geometric series $\sum_{n=0}^\infty||I-BA^{-1}||^n$), so $BA^{-1}$ is invertible with $(BA^{-1})^{-1}=(I-(I-BA^{-1}))^{-1}=\sum_{n=0}^\infty(I-BA^{-1})^n$. Therefore $B=(BA^{-1})A$ is a product of invertible operators, hence invertible, so $B_r(A)\subset GL(X)$. $_\blacksquare$
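In finite dimension the Neumann-series argument can be checked numerically. The sketch below is an illustration only (the proof above works in any Banach space); it uses the spectral norm for $||\cdot||_{\mathcal{L}(X)}$ and the names $A$, $B$, $r$ from the proof:

```python
import numpy as np

# Finite-dimensional sanity check of the Neumann-series inverse.
rng = np.random.default_rng(0)
A = np.eye(3) + 0.1 * rng.standard_normal((3, 3))   # an invertible A
A_inv = np.linalg.inv(A)
r = 1.0 / np.linalg.norm(A_inv, 2)                  # r = 1/||A^{-1}||

# Pick B inside the ball B_r(A): ||A - B|| = r/2 < r.
E = rng.standard_normal((3, 3))
E *= 0.5 * r / np.linalg.norm(E, 2)
B = A + E

K = np.eye(3) - B @ A_inv
assert np.linalg.norm(K, 2) < 1                     # ||I - B A^{-1}|| < 1

# Partial sums of the Neumann series sum_n K^n converge to (B A^{-1})^{-1}.
S = np.zeros((3, 3))
P = np.eye(3)
for _ in range(200):
    S += P
    P = P @ K
print(np.linalg.norm(S @ (B @ A_inv) - np.eye(3), 2))  # ~ 0
```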
(2) We first prove $GL(X)\setminus\mathcal{H}(X)$ is closed in $GL(X)$. Let $\{A_n\}_{n\in\mathbb{N}}\subset GL(X)\setminus\mathcal{H}(X)$ be a sequence converging to $A\in GL(X)$; we need to prove $A\notin \mathcal{H}(X)$. By definition, for each $n$ there is $\lambda_n\in \mathbb{C}$ such that $|\lambda_n|=1$ and $A_n-\lambda_nI\notin GL(X)$. Since $\mathbb{S}^1:=\{\lambda\in\mathbb{C}\mid |\lambda|=1\}$ is compact, there is a subsequence $\{\lambda_{i_n}\}_{n\in\mathbb{N}}$ converging to some $\lambda\in\mathbb{S}^1$, so $\lim_{n\to\infty} (A_{i_n}-\lambda_{i_n}I)=A-\lambda I$. Each $A_{i_n}-\lambda_{i_n} I$ lies in the complement of $GL(X)$, which is closed in $\mathcal{L}(X)$ because $GL(X)$ is open by (1); hence $A-\lambda I\notin GL(X)$. So $\lambda\in\sigma(A)$ with $|\lambda|=1$, i.e. $A\in GL(X)\setminus\mathcal{H}(X)$.
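A toy finite-dimensional illustration of this limit argument (the matrices below are my own example, not from the problem): each $A_n$ carries an eigenvalue on the unit circle, and those eigenvalues accumulate at a unit-modulus eigenvalue of the limit $A$.

```python
import numpy as np

# Dimension-2 illustration: A_n = diag(e^{i(1+1/n)}, 2) is non-hyperbolic for
# every n (one eigenvalue on the unit circle), and A_n -> A = diag(e^{i}, 2).
A = np.diag([np.exp(1j), 2.0])
A_seq = [np.diag([np.exp(1j * (1 + 1 / n)), 2.0]) for n in range(1, 100)]

# The unit-modulus eigenvalues lambda_n = e^{i(1+1/n)} converge to
# lambda = e^{i}, an eigenvalue of the limit A: A is again non-hyperbolic.
dist = np.linalg.norm(A_seq[-1] - A, 2)
on_circle = np.min(np.abs(np.abs(np.linalg.eigvals(A)) - 1.0))
print(dist)       # ~ 0: the sequence approaches A
print(on_circle)  # ~ 0: A has an eigenvalue of modulus 1
```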
To prove $\mathcal{H}(X)$ is dense in $GL(X)$, my idea is this: let $A\in GL(X)\setminus\mathcal{H}(X)$ and $\epsilon>0$ be arbitrary; then there exists $z\in\mathbb{C}$ with $|z|<\epsilon$ such that $A+zI\in\mathcal{H}(X)$. I feel like this could work, but I don't know how to guarantee $\sigma(A+zI)\cap\mathbb{S}^1=\emptyset$.
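In finite dimension this idea clearly works, because $\sigma(A+zI)=\{\lambda+z\mid\lambda\in\sigma(A)\}$ is finite, so almost every small $z$ moves all eigenvalues off the unit circle; the infinite-dimensional case is the hard part, since $\sigma(A)$ can be uncountable. A small numerical sketch of the finite-dimensional picture (the rotation matrix is my own example):

```python
import numpy as np

# A is a rotation by t = 1 radian, so both eigenvalues e^{±i} lie on the
# unit circle: A is in GL but not hyperbolic.
t = 1.0
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

eps = 1e-3
z = eps / 2                       # a small real shift suffices here
Az = A + z * np.eye(2)            # sigma(A + zI) = sigma(A) + z

before = np.abs(np.linalg.eigvals(A))
after = np.abs(np.linalg.eigvals(Az))
print(before)   # both moduli equal 1
print(after)    # both moduli differ from 1: A + zI is hyperbolic
```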
I haven't taken a course on the spectral theorem yet, but let me try to answer your question. I would appreciate being corrected if I am mistaken.
Your idea can work: to prove $\mathcal{H}(X)$ is dense in $GL(X)$, let $A\in GL(X)\setminus \mathcal{H}(X)$ and $\epsilon> 0$ be arbitrary. Then there exists $\lambda_0\in \sigma(A)$ with $|\lambda_0|= 1$, and we are looking for a $T\in \mathcal{H}(X)$ (i.e. $|\lambda|\neq 1$ for all $\lambda\in \sigma(T)$) with $\|A-T\|< \epsilon$.
Edit : (Thanks to Wraith1995)
We will use the assumption $\sigma(T)=\overline{\sigma_p(T)}$, where $\sigma_p(T)$ is the point spectrum (this holds for some, but not all, operators, so what follows is a heuristic), to 'avoid' an open set containing $S^1$, namely the open annulus $S= \{z\in \mathbb{C}:\ 1-\frac{\epsilon}{3}<|z|<1+\frac{\epsilon}{3}\}$.
Take $\{b_i\}_{i\in I}$ to be a basis of $X$ obtained by extending a set of eigenvectors of $A$, so that $Ab_i= \lambda_ib_i$ for each $i\in J\subset I$, where $\{\lambda_i\}_{i\in J}=\sigma_p(A)$.
Let $D$ be the 'diagonal' operator with respect to the basis $\{b_i\}_{i\in I}$ of $X$, defined by $Db_i= z_ib_i$ for each $i\in J$ and $Db_l= 0$ for all $l\in I\setminus J$.
To guarantee $\sigma(A+D)\cap S^1= \emptyset$, note that $\sigma_p(A+D)= \{\lambda_i+z_i\big|\ i\in J \}$ and $$\sigma(A+D)= \overline{\{\lambda_i+z_i\big|\ i\in J \}}$$ For each $i\in J$, to guarantee $\lambda_i+z_i\notin S$ we choose $z_i$ with modulus $\frac{\epsilon}{3}<|z_i|<\epsilon$ (the upper bound keeps $\|D\|<\epsilon$, hence $\|A-(A+D)\|<\epsilon$) and with the same argument as $\lambda_i$ when $|\lambda_i|\geq 1$, pushing $\lambda_i$ radially outward; otherwise, when $|\lambda_i|<1$, we pick $z_i$ with the 'opposite' argument, $\arg(z_i)= \arg(\lambda_i)+\pi$, pushing it radially inward. Then every $\lambda_i+z_i$ lies in the complement of the open annulus $S$, which is closed, so the closure $\sigma(A+D)$ does not meet $S\supset S^1$, as desired. (One should also choose the $z_i$ so that $0\notin\sigma(A+D)$, ensuring $A+D\in GL(X)$.)
Of course, $\epsilon$ is taken to be very small, say $\epsilon<\frac{1}{7}$.
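A finite-dimensional sketch of this radial push (in finite dimension the eigenvector basis and the diagonal $D$ are unproblematic; the Banach-space subtleties are exactly what the heuristic above assumes away). The eigenvalues below are my own example:

```python
import numpy as np

# A is diagonal in the basis {b_i}, with some eigenvalues on or near the
# unit circle; D pushes each eigenvalue radially by a modulus strictly
# between eps/3 and eps: outward if |lambda_i| >= 1, inward otherwise.
eps = 1 / 7
lam = np.array([np.exp(1j * 0.3), 1.0 + 1e-4, 0.9999, 0.5j])
A = np.diag(lam)

def radial_push(l, eps):
    # z_i with arg(z_i) = arg(l) (outward) or arg(l) + pi (inward)
    direction = l / abs(l)
    sign = 1.0 if abs(l) >= 1 else -1.0
    return sign * direction * (eps / 2)     # eps/3 < |z_i| < eps

z = np.array([radial_push(l, eps) for l in lam])
D = np.diag(z)

new_moduli = np.abs(np.linalg.eigvals(A + D))
in_annulus = (new_moduli > 1 - eps / 3) & (new_moduli < 1 + eps / 3)
print(np.linalg.norm(D, 2) < eps)   # True: the perturbation is small
print(in_annulus.any())             # False: no eigenvalue left in the annulus
```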