Suppose $A$ is a symmetric $n\times n$ matrix. How do I know that $A$ has $n$ (linearly independent) eigenvectors?
By way of context, I've spent a rather unproductive 5 or so hours watching various YouTube videos on the spectral theorem, orthogonal diagonalization, and algebraic multiplicity bounding geometric multiplicity, and thumbing through Anton's Elementary Linear Algebra (4.6 Change of Basis, 5.1 Eigenvalues and Eigenvectors, 5.2 Diagonalization, 6.4 Gram-Schmidt Process, 7.1 Orthogonalization, 7.2 Orthogonal Diagonalization, and 9.4 Singular Value Decomposition). (I found this video the most helpful, but I didn't quite understand why $A$ defines an $(n - 1)\times (n - 1)$ matrix.)
I'm sure the answer to my question is in there somewhere, but something isn't quite clicking.
I understand:
- that $A = A^T$, or equivalently, that $\langle A\vec{x},\vec{y} \rangle = \langle \vec{x},A\vec{y}\rangle$ for all $\vec{x},\vec{y}$.
- that, if $\vec{x}$ and $\vec{y}$ are eigenvectors of $A$ with eigenvalues $\lambda$ and $\mu$ respectively, and if $\lambda \neq \mu$, then $\lambda\langle \vec{x},\vec{y} \rangle = \langle \lambda\vec{x},\vec{y} \rangle = \langle A\vec{x},\vec{y} \rangle = \langle \vec{x},A\vec{y}\rangle = \langle \vec{x},\mu\vec{y}\rangle = \mu\langle \vec{x},\vec{y}\rangle$, so $(\lambda - \mu)\langle \vec{x},\vec{y}\rangle = 0$ and hence $\langle \vec{x},\vec{y}\rangle = 0$.
- I don't understand how I know there's a next eigenvector of $A$, or how I would know that $\lambda \neq \mu$.
- that, from the fundamental theorem of algebra, every square matrix must have at least one eigenvalue, and hence at least one eigenvector (since $\det(\lambda I - A) = 0$ must have at least one solution, over $\mathbb{C}$ at least).
- that I can iteratively generate, through the Gram-Schmidt process, an orthonormal basis from any set $B$ that forms a basis. That is, if I already knew I had $n$ linearly independent eigenvectors, Gram-Schmidt would let me generate an orthonormal basis for each eigenspace wherever I have repeated roots.
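To make the orthogonality point above concrete, here is a small numerical sketch (the particular $3\times 3$ matrix is my own arbitrary choice for illustration). NumPy's `eigh`, which is designed for symmetric matrices, returns $n$ real eigenvalues together with $n$ orthonormal eigenvectors, which is exactly the conclusion I'm trying to understand:

```python
import numpy as np

# An arbitrary symmetric matrix, chosen only for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is NumPy's eigensolver for symmetric (Hermitian) matrices:
# it returns real eigenvalues and orthonormal eigenvector columns.
vals, vecs = np.linalg.eigh(A)

# The columns of vecs are orthonormal, so vecs.T @ vecs is the identity,
# and each column v_j satisfies A v_j = vals[j] * v_j.
print(np.round(vecs.T @ vecs, 10))
```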
I don't yet know what Hermitian, unitary, and conjugate transpose mean.
I also don't understand why there should exist a symmetric matrix of size $(n-1)\times (n-1)$, such that I could simply presume that another eigenvector has to exist in, or come out of, that iteratively smaller matrix.
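Here is a numerical sketch of the step I'm confused about, as I currently understand it (the matrix and variable names are my own assumptions). Given one unit eigenvector $v_1$ of $A$, the key facts are: (i) $A$ maps the subspace $v_1^\perp$ into itself, because $\langle A\vec{v}, v_1\rangle = \langle \vec{v}, Av_1\rangle = \lambda\langle \vec{v}, v_1\rangle = 0$ for any $\vec{v}\perp v_1$; and (ii) writing $A$ restricted to $v_1^\perp$ in an orthonormal basis of that subspace gives an $(n-1)\times(n-1)$ matrix that is again symmetric, so the argument can repeat:

```python
import numpy as np

# Arbitrary symmetric matrix, for illustration only.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

# Take one unit eigenvector v1 (guaranteed to exist by the
# fundamental-theorem-of-algebra argument).
vals, vecs = np.linalg.eigh(A)
v1 = vecs[:, 0]

# Extend v1 to an orthonormal basis of R^n via QR; the remaining
# columns W form an orthonormal basis of the subspace v1-perp.
Q, _ = np.linalg.qr(np.column_stack([v1, np.eye(n)[:, :n - 1]]))
W = Q[:, 1:]

# A maps v1-perp into itself: every column of A @ W is orthogonal to v1.
print(np.round(v1 @ A @ W, 10))

# The restriction of A to v1-perp, written in the basis W, is the
# (n-1) x (n-1) matrix B -- and it is symmetric again, so induct.
B = W.T @ A @ W
print(B.shape, np.allclose(B, B.T))
```

The symmetry of $B$ is immediate: $B^T = (W^T A W)^T = W^T A^T W = W^T A W = B$. That, as I understand it, is why the "iteratively smaller matrix" in the video is again symmetric and so must again have an eigenvector.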