From pg. 36 of Shankar's Principles of Quantum Mechanics:
Here in red (which confuses me) Shankar seems to be appealing to this proof:
Lemma: If an operator on an $n$-dimensional space has an eigenpair, it necessarily has $n$ (not necessarily distinct) eigenvalues.
Lemma proof sketch:
- Suppose $\Omega |V\rangle = \omega |V\rangle$ (that is, assume $\Omega$ has at least one eigenpair $|V\rangle$, $\omega$).
- Then $\det(\Omega - \omega I) = 0$; otherwise $(\Omega - \omega I)$ would be invertible, leading to a contradiction.
- But $\det(\Omega - \omega I)$ expands to the characteristic polynomial, which when set to $0$ has $n$ roots counted with multiplicity (the fundamental theorem of algebra).
- Hence these $n$ (not necessarily distinct) roots must be eigenvalues via (2).
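As a numeric sanity check of step (3), here is a small NumPy sketch (the $3 \times 3$ matrix and the library calls are my own illustration, not from Shankar): the characteristic polynomial of an $n \times n$ complex matrix has exactly $n$ roots counted with multiplicity, and they coincide with the eigenvalues.

```python
import numpy as np

# Hypothetical 3x3 complex matrix standing in for Omega (n = 3).
rng = np.random.default_rng(0)
Omega = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# np.poly gives the coefficients of det(w*I - Omega), which differs from
# det(Omega - w*I) only by an overall sign, so the roots are the same.
char_coeffs = np.poly(Omega)
roots = np.roots(char_coeffs)  # n roots, counted with multiplicity

# The roots coincide with the eigenvalues of Omega.
eigs = np.linalg.eigvals(Omega)
assert len(roots) == 3
assert np.allclose(np.sort_complex(roots), np.sort_complex(eigs))
```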
NOTE: The full details of this proof, as actually presented in Shankar's book, can be found here. I suspect there's a chance I've misunderstood it, though I don't see how.
Problem 1: I am having trouble understanding this argument, since it seems to me it leaves open the possibility that $\Omega$ has no eigenpairs at all. That is, (4) may simply be false, because its failure would only mean that (1) is false. Put differently: the above proof seems to establish only the conditional claim "if (1) is true, then (4) is true", but (1) might not be true!
The proof continues with more confusion for me:
Problem 2: Why must the first column of the matrix in this basis be $[ω_1, 0, …, 0]^T$? This seems equivalent to saying that
$$ \Omega | \omega_1 \rangle = \omega_1 |1\rangle $$
but I don't see why this must be so. Perhaps it has something to do with the following two facts, but I don't quite see the connection?
- $\Omega | \omega_1 \rangle = \omega_1 | \omega_1 \rangle$
- $\| \omega_1 | \omega_1 \rangle \| = |\omega_1| \, \| | \omega_1 \rangle \| = |\omega_1| \cdot 1 = |\omega_1|$


Problem 1: What is left implicit here is that the condition $\det(\Omega - \omega I) = 0$ is in fact equivalent to $\omega$ being an eigenvalue of $\Omega$. Indeed, we have
$$\begin{aligned} \det(\Omega - \omega I) = 0 &\iff \Omega - \omega I \text{ is not an isomorphism} \\ &\iff \Omega - \omega I \text{ is not injective} \\ &\iff \text{there is a nonzero vector $V$ with } (\Omega - \omega I)V = 0 \\ &\iff \text{there is a nonzero vector $V$ with } \Omega V = \omega V. \end{aligned}$$
So it is really true that a complex number $\omega$ is an eigenvalue of $\Omega$ if and only if it is a zero of the characteristic polynomial. In particular, nothing needs to be assumed up front: the characteristic polynomial has degree $n \geq 1$, so by the fundamental theorem of algebra it has at least one root, and every root is an eigenvalue by the equivalence above. This is why (1) is not an unverified hypothesis.
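The equivalence can be checked numerically. In this sketch (the $2 \times 2$ matrix is a made-up example), $\det(\Omega - \omega I) = 0$ for a known eigenvalue $\omega$, and the null space of $\Omega - \omega I$, extracted via the SVD, supplies a nonzero $V$ with $\Omega V = \omega V$:

```python
import numpy as np

# A hypothetical Omega with a known eigenvalue omega = 2 (eigenvalues 2 and 3).
Omega = np.array([[2.0, 1.0],
                  [0.0, 3.0]])
omega = 2.0

# det(Omega - omega*I) vanishes ...
M = Omega - omega * np.eye(2)
assert abs(np.linalg.det(M)) < 1e-12

# ... exactly when M has a nontrivial null space, i.e. a nonzero V with
# Omega V = omega V.  The right singular vector belonging to the zero
# singular value spans ker(M).
_, s, Vh = np.linalg.svd(M)
V = Vh[-1]                  # null vector (its singular value is ~ 0)
assert s[-1] < 1e-12
assert np.allclose(Omega @ V, omega * V)
```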
Problem 2: The first column being $[\omega_1, 0, \dots, 0]^T$ just means that $\Omega \vert\omega_1\rangle = \omega_1 \vert\omega_1\rangle$. Recall: what does it mean to write a linear transformation as a matrix? It means that for every element $e_i$ of some chosen basis $\{e_1, \dots, e_n\}$, you write $\Omega e_i$ as
$$\Omega e_i = \sum_{j=1}^n \Omega_{ji} e_j,$$
and the coefficients $\Omega_{ji}$ are the matrix entries appearing in the $i$-th column. In this case, we choose the basis $\{\lvert \omega_1 \rangle, V^1, \dots, V^{n-1}\}$, and we have
$$\Omega \vert\omega_1\rangle = \omega_1 \vert\omega_1\rangle = \omega_1 \vert\omega_1\rangle + 0 \, V^1 + \dots + 0 \, V^{n-1},$$
and thus the matrix entries in the first column are $[\omega_1, 0, \dots, 0]^T$.
Note that there is no mention of a vector $\vert 1 \rangle$.
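To make this concrete, here is a sketch (with a made-up symmetric $2 \times 2$ matrix): put a normalized eigenvector $\vert\omega_1\rangle$ first in a basis, complete it with one more vector $V^1$, and the matrix of $\Omega$ in that basis indeed has first column $[\omega_1, 0]^T$:

```python
import numpy as np

# Hypothetical Omega; one of its eigenvectors will head the new basis.
Omega = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(Omega)
w1 = eigvals[0]
v1 = eigvecs[:, 0]               # |omega_1>, already normalized by numpy

# Complete {|omega_1>, V^1} to a basis (Gram-Schmidt against v1).
u = np.array([1.0, 0.0])
V1 = u - (v1 @ u) * v1
V1 /= np.linalg.norm(V1)
P = np.column_stack([v1, V1])    # columns are the new basis vectors

# In this basis the matrix of Omega has first column [omega_1, 0]^T,
# because Omega|omega_1> = omega_1|omega_1> + 0 * V^1.
Omega_new = np.linalg.inv(P) @ Omega @ P
assert np.allclose(Omega_new[:, 0], [w1, 0.0])
```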