I know how to prove what the title says using determinants, but let's say I wanted to use another approach. In Axler's Linear Algebra Done Right, he avoids determinants when proving results that are usually proved with them. He successfully proves that every linear operator $T$ on a finite-dimensional, nonzero complex vector space $V$ has an eigenvalue by roughly the following procedure:
Let $\mathbf{v}\in V$ with $\mathbf{v}\neq\mathbf{0}$, and let $n=\dim V$. Then the list $(\mathbf{v},T\mathbf{v},\dots,T^n\mathbf{v})$ is linearly dependent, since it contains $n+1$ vectors in an $n$-dimensional space.
This means that there exist scalars $a_0,\dots,a_n$, not all zero, such that
$$a_0\mathbf{v}+a_1T\mathbf{v}+\dots+a_nT^n\mathbf{v}=\mathbf{0}.$$
Let $m$ be the largest index such that $a_m\neq0$; such an $m$ exists because the coefficients are not all zero. Moreover $0<m\leq n$, since $m=0$ would give $a_0\mathbf{v}=\mathbf{0}$ with $\mathbf{v}\neq\mathbf{0}$, forcing $a_0=0$.
By the fundamental theorem of algebra, the polynomial $a_0+a_1z+\dots+a_mz^m$ splits into linear factors over $\mathbb{C}$, so the expression above can be factored as:
$$c(T-\lambda_1I)\dots(T-\lambda_mI)\mathbf{v}=0.$$
Since $\mathbf{v}\neq\mathbf{0}$, the composition is not injective, so at least one of the factors $(T-\lambda_jI)$ is not injective; that $\lambda_j$ is then an eigenvalue of $T$.
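(To spell out that last step: it is just the contrapositive of the fact that a composition of injective maps is injective,
$$c(T-\lambda_1I)\cdots(T-\lambda_mI)\,\mathbf{v}=\mathbf{0},\quad \mathbf{v}\neq\mathbf{0}\ \Longrightarrow\ \ker(T-\lambda_jI)\neq\{\mathbf{0}\}\ \text{for some }j,$$
and a nonzero vector in $\ker(T-\lambda_jI)$ is an eigenvector for $\lambda_j$.)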
Could one prove the statement in the title in a similar way? I think one could get a factor of the form $(T^2+\alpha T+\beta I)$, but I'm stuck on proving that exactly that factor must be injective. Ideas?
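For concreteness, here is what I believe the real analogue of the factorization would look like: over $\mathbb{R}$, the polynomial $a_0+a_1x+\dots+a_mx^m$ factors into linear factors and irreducible quadratic factors,
$$a_0+a_1x+\dots+a_mx^m=c\prod_i(x-\lambda_i)\prod_j\bigl(x^2+\alpha_jx+\beta_j\bigr),\qquad \alpha_j^2<4\beta_j,$$
and completing the square in each quadratic factor gives
$$T^2+\alpha_jT+\beta_jI=\Bigl(T+\tfrac{\alpha_j}{2}I\Bigr)^2+\Bigl(\beta_j-\tfrac{\alpha_j^2}{4}\Bigr)I,$$
where $\beta_j-\alpha_j^2/4>0$. This is where I get stuck: I don't see how to conclude anything about the injectivity of such a factor from this form alone.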