I am coming from the programming world, and I am currently trying to solve an eigenvalue problem in code. However, I find it very difficult for matrices of higher dimension. For 2x2 and 3x3 matrices I found this Wikipedia article, which shows a direct-calculation algorithm: https://en.wikipedia.org/wiki/Eigenvalue_algorithm (Direct calculation).
I also read about the Jacobi–Davidson method, but I could not understand how it works.
The question is: what algorithm should I use to calculate the eigenvalues of higher-dimensional matrices?
One of the popular methods for finding eigenvalues is the power iteration method.
Starting with a random vector $\bf v_0$, you compute successive matrix-vector products $$\bf v_{n+1} = Av_n$$ and stop when the element-wise division $$\frac{\bf v_{n+1}}{\bf v_n}$$ produces nearly the same value in every component; that common value is the eigenvalue of largest magnitude.
To keep the iterates from overflowing or underflowing before convergence, normalize after each step: $$\bf v_n = \frac{v_n}{\|v_n\|}$$
Once you have found the largest eigenvalue, store the corresponding normalized vector $\bf w_1 = v_n$ and restart the algorithm with a new random vector, but this time first project away the previously found eigenvector:
$$\bf v_0 = v_0 - (v_0 \cdot w_1)\,w_1$$
This way no component of the new starting vector can grow in the direction of the eigenvector already found, so the iteration converges to the next-largest eigenvalue. (Note that this deflation works as stated for symmetric matrices, whose eigenvectors are orthogonal; in floating point you should also re-project during the iteration to counter round-off drift.)
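Putting the two steps together, a deflated power iteration for the first few eigenvalues of a symmetric matrix might look like this (a sketch under the symmetric-matrix assumption; the per-step re-projection guards against round-off reintroducing the removed directions):

```python
import numpy as np

def deflated_power_iteration(A, k, tol=1e-10, max_iter=10_000, seed=None):
    """First k eigenvalues (by magnitude) of a symmetric A, via power
    iteration with projection deflation."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    eigvals, eigvecs = [], []
    for _ in range(k):
        v = rng.standard_normal(n)
        for w in eigvecs:              # project away known eigenvectors
            v -= (v @ w) * w
        v /= np.linalg.norm(v)
        lam = 0.0
        for _ in range(max_iter):
            u = A @ v
            for w in eigvecs:          # re-project to counter round-off drift
                u -= (u @ w) * w
            lam_new = v @ u            # Rayleigh quotient estimate
            v = u / np.linalg.norm(u)
            if abs(lam_new - lam) < tol:
                break
            lam = lam_new
        eigvals.append(lam_new)
        eigvecs.append(v)
    return eigvals

# Example: eigenvalues of this symmetric matrix are (5 ± sqrt(5)) / 2
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
vals = deflated_power_iteration(A, 2)
```

For large or sparse problems you would normally reach for a production routine instead (e.g. `numpy.linalg.eigh` for dense symmetric matrices, or `scipy.sparse.linalg.eigsh` for a few eigenvalues of a large sparse matrix), but the sketch shows the mechanics the answer describes.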