I have a 4x4 covariance matrix and want to find the eigenvalues. I know part of the process is to find the determinant:
$$\tiny{\begin{align}\begin{vmatrix} 3.33-\lambda & -1.00 & 3.33 & 33.00 \\ -1.00 & 1.58-\lambda & -1.92 & -13.92 \\ 3.33 & -1.92 & 62.92-\lambda & -23.42 \\ 33.00 & -13.92 & -23.42 & 398.92-\lambda \end{vmatrix} &= (3.33 - \lambda)\begin{vmatrix}1.58 - \lambda & -1.92 & -13.92 \\ -1.92 & 62.92 - \lambda & -23.42 \\ -13.92 & -23.42 & 398.92 - \lambda\end{vmatrix} -(-1) \begin{vmatrix}-1.00 & -1.92 & -13.92 \\ 3.33 & 62.92 - \lambda & -23.42 \\ 33.00 & -23.42 & 398.92 - \lambda \end{vmatrix}\\ &+ 3.33 \begin{vmatrix}-1.00 & 1.58 - \lambda & -13.92 \\ 3.33 & -1.92 & -23.42 \\ 33.00 & -13.92 & 398.92 - \lambda\end{vmatrix} - 33 \begin{vmatrix}-1.00 & 1.58 - \lambda & -1.92 \\ 3.33 & -1.92 & 62.92 - \lambda \\ 33.00 & -13.92 & -23.42\end{vmatrix}\end{align}}$$
But the computation is laborious, since each 3x3 determinant must itself be expanded into minors. I'm working by hand, and I worry about what happens when I encounter higher-dimensional matrices. Is there a faster method for finding the determinant, something I haven't learned yet?
Unfortunately, there are very few shortcuts for dense (non-sparse) matrices. Cofactor expansion is tried and true, and essentially all you can reliably do by hand. If this is for an experiment, and you can tolerate some numerical error and noise, there are iterative methods, such as power iteration and the QR algorithm, that converge to the eigenvalues without ever forming the characteristic polynomial.
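For instance, power iteration, one of the simplest of these iterative methods, converges to the eigenvalue of largest magnitude for a symmetric matrix like yours. A minimal sketch in Python/NumPy using the matrix from the question (cross-checked against NumPy's symmetric eigensolver):

```python
import numpy as np

# The 4x4 covariance matrix from the question.
A = np.array([
    [ 3.33,  -1.00,   3.33,   33.00],
    [-1.00,   1.58,  -1.92,  -13.92],
    [ 3.33,  -1.92,  62.92,  -23.42],
    [33.00, -13.92, -23.42,  398.92],
])

def power_iteration(A, iters=500):
    """Repeatedly apply A and renormalize: the vector converges to the
    dominant eigenvector, and the Rayleigh quotient v.T @ A @ v to the
    eigenvalue of largest magnitude."""
    v = np.ones(A.shape[0])
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)
    return v @ A @ v  # Rayleigh quotient

dominant = power_iteration(A)
# Sanity check against the full symmetric eigensolver.
all_eigs = np.linalg.eigvalsh(A)
print(dominant)
print(all_eigs)
```

Deflation or shifted variants then recover the remaining eigenvalues; in practice `np.linalg.eigvalsh` (which uses a QR-type method internally) does all of this for you.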