Is there a quick way to get a rough estimate of an eigenvector/eigenvalue of a matrix? By "quick" I mean a method that can be carried out without a computer or pencil and paper: something you could do in your head.
A quick way to estimate eigenvector/eigenvalue of a matrix

5.5k views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 solutions below.
For a positive matrix, there is indeed a way to estimate a positive eigenvector directly from the matrix entries: the eigenvector associated with the eigenvalue described in Vladhagen's answer.
To estimate a positive column eigenvector of a positive matrix, use the fact that the true ray of positive column eigenvectors is contained in the convex hull of the rays determined by the columns. So, for instance, any single column gives an estimate, and summing the columns usually gives a better one. From Vladhagen's second example I get the column sum $$\begin{pmatrix}15\\11\\11\end{pmatrix} $$

One trouble with insisting on no calculation at all is that eigenvectors are usually reported in normalized form, e.g. divided by the $L^2$ norm (the length of the vector). Doing that with this example gives $$\begin{pmatrix} .694 \\ .509 \\ .509 \end{pmatrix} $$ This compares well with the eigenvector calculated to 3 decimal places on some random on-line calculator that I found: $$\begin{pmatrix} .698 \\ .483 \\ .528 \end{pmatrix} $$
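The column-sum estimate above is easy to automate. A minimal pure-Python sketch (the matrix and the expected values come from the example in this answer):

```python
def column_sum_estimate(A):
    """Estimate the positive (Perron) eigenvector of an entrywise-positive
    matrix by summing its column vectors, then L2-normalizing the result."""
    # Component i of the column sum is the i-th row sum of A.
    v = [sum(row) for row in A]
    norm = sum(x * x for x in v) ** 0.5
    return [x / norm for x in v]

# Vladhagen's second example:
A = [[5, 2, 8], [3, 7, 1], [5, 2, 4]]
print([round(x, 3) for x in column_sum_estimate(A)])  # → [0.694, 0.509, 0.509]
```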
If you want a better eigenvector estimate, square the matrix before doing the calculation. Or cube it … As you take higher and higher powers, the estimate is guaranteed to converge by the Perron-Frobenius theorem; this is the simplest instance of what the earlier comments call "iterative methods" (power iteration).
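A pure-Python sketch of that iteration (the matrix and the reference eigenvector are the ones quoted above; the eigenvalue bound 11–13 comes from Vladhagen's answer below):

```python
def matvec(A, v):
    """Matrix-vector product for a matrix stored as a list of rows."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def power_iteration(A, iters=100):
    """Repeatedly multiply by A and renormalize; for a positive matrix
    this converges to the Perron eigenvector (Perron-Frobenius theorem).
    Squaring or cubing the matrix first just takes these steps in
    bigger jumps."""
    v = [1.0] * len(A)
    for _ in range(iters):
        w = matvec(A, v)
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    # Rayleigh quotient: estimate of the dominant eigenvalue.
    lam = sum(x * y for x, y in zip(matvec(A, v), v))
    return lam, v

A = [[5, 2, 8], [3, 7, 1], [5, 2, 4]]
lam, v = power_iteration(A)
# lam lands between 11 and 13, and v is close to (0.698, 0.483, 0.528).
```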
Some methods:
If the row sums are all the same (say equal to $r$), then $(1,1,\ldots,1)$ is an eigenvector associated with $\lambda = r$. If instead the column sums are all equal to $r$, the same applies to $A^T$, so $r$ is still an eigenvalue of $A$ (with $(1,1,\ldots,1)$ as a left eigenvector).
Also, Geršgorin's circle theorem gives some good estimates, especially if the off-diagonal entries are small: every eigenvalue lies in some disc centered at a diagonal entry $a_{ii}$, with radius the sum of the absolute values of the off-diagonal entries in row $i$.
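Computing the Geršgorin discs is just one pass over the rows. A minimal sketch (the example matrix is the first one from this answer):

```python
def gershgorin_discs(A):
    """Geršgorin's circle theorem: every eigenvalue of A lies in at least
    one disc centered at a diagonal entry A[i][i], whose radius is the
    sum of the absolute off-diagonal entries of row i."""
    return [(A[i][i], sum(abs(x) for j, x in enumerate(A[i]) if j != i))
            for i in range(len(A))]

A = [[1, 2, 8], [3, 7, 1], [5, 2, 4]]
print(gershgorin_discs(A))  # → [(1, 10), (7, 4), (4, 7)]
```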
If your matrix has all positive entries, then the largest eigenvalue is positive (and real), and it lies between the minimum and maximum row sums, and also between the minimum and maximum column sums.
Some examples: $$\begin{pmatrix}1 & 2& 8\\3&7&1\\5&2&4\end{pmatrix}$$ must have 11 as an eigenvalue, since the rows all sum to the same thing.
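This is easy to verify directly: multiplying by the all-ones vector just computes the row sums. A quick check in Python:

```python
# The matrix below has every row summing to 11, so (1, 1, 1) is an
# eigenvector with eigenvalue 11: A applied to the all-ones vector
# returns the vector of row sums.
A = [[1, 2, 8], [3, 7, 1], [5, 2, 4]]
ones = [1, 1, 1]
Av = [sum(a * x for a, x in zip(row, ones)) for row in A]
assert Av == [11 * x for x in ones]
```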
Another example of estimation: $$\begin{pmatrix}5 & 2& 8\\3&7&1\\5&2&4\end{pmatrix}$$
has largest eigenvalue between 11 and 13 (inclusive), which is not at all obvious. (The row sums are 15, 11, and 11, and the column sums are 13, 11, and 13; intersecting the two bounds gives the interval $[11, 13]$.)
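The min/max bound can be sketched in a few lines; intersecting the row-sum and column-sum intervals reproduces the $[11, 13]$ bound for the matrix above:

```python
def perron_bounds(A):
    """Bounds on the largest eigenvalue of an entrywise-positive matrix:
    it lies between the min and max row sums, and also between the min
    and max column sums, so intersecting both intervals is tightest."""
    rows = [sum(r) for r in A]
    cols = [sum(c) for c in zip(*A)]
    return max(min(rows), min(cols)), min(max(rows), max(cols))

A = [[5, 2, 8], [3, 7, 1], [5, 2, 4]]
print(perron_bounds(A))  # → (11, 13)
```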
I use this trick of min/max row-sum bounds for matrices with positive entries quite often in my research.
The gist is that for "smallish" matrices, getting eigenvalue estimates in your head is often not too hard.
Another thing to look into would be "Subadditivity of inertia."
For symmetric (or Hermitian) matrices: if $A = B + C$, the number of positive eigenvalues of $A$ is at most the sum of the numbers of positive eigenvalues of $B$ and $C$. The same can be said for negative eigenvalues of $A$.
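A small sketch of the bound, using $2\times 2$ symmetric matrices so the eigenvalues come from the quadratic formula. The particular matrices here are hypothetical examples chosen for this note, not from the answer:

```python
import math

def symmetric_2x2_eigenvalues(a, b, c):
    """Eigenvalues of the symmetric matrix [[a, b], [b, c]]:
    (a+c)/2 plus or minus sqrt(((a-c)/2)^2 + b^2)."""
    mean = (a + c) / 2
    spread = math.hypot((a - c) / 2, b)
    return mean - spread, mean + spread

def n_positive(eigs):
    return sum(1 for x in eigs if x > 0)

# Illustrative decomposition A = B + C with diagonal pieces:
# B = diag(1, 1) has 2 positive eigenvalues, C = diag(-3, 1) has 1,
# and A = diag(-2, 2) has 1, consistent with 1 <= 2 + 1.
pos_B = n_positive(symmetric_2x2_eigenvalues(1, 0, 1))
pos_C = n_positive(symmetric_2x2_eigenvalues(-3, 0, 1))
pos_A = n_positive(symmetric_2x2_eigenvalues(-2, 0, 2))
assert pos_A <= pos_B + pos_C
```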
These are tools I find to be helpful. They are not meant to be an all encompassing treatise on eigenvalue estimation.