Numerically computing eigenvalues -- what is it useful for?


Cross-posted on Scientific Computing Stack Exchange

Are there real-world applications that call specifically for eigenvalues rather than singular values?

The top eigenvalue is useful for establishing convergence, but what about the rest?

I often see the eigendecomposition used as a "poor man's SVD". For instance, it's used in Matlab's Lyapunov solver, but that could be reformulated in terms of the SVD at greater cost ($22n^3$ flops instead of $9n^3$; Higham's "big six"), while gaining numerical stability. Similarly, PCA can be done using the SVD.

Picture below: two linear transformations with the same eigenvalues:

[two figures omitted: linear transformations sharing the same eigenvalues]

Notebook

4 Answers

Accepted answer:

The eigenvalues of partial differential operators describing mechanical or electromagnetic systems are related to resonance frequencies. For example, the frequencies at which a drum, guitar, or other stringed instrument vibrates are the square roots of the eigenvalues of the Laplace operator. The frequencies at which a building or bridge sways are the square roots of the eigenvalues of the linear elasticity operator. The frequencies at which an electromagnetic cavity (say, in your microwave oven, or in the particle accelerators used in medical cancer-therapy devices) oscillates are the square roots of the eigenvalues of the Maxwell operator. There are many practical applications in which knowing these resonance frequencies is important, typically because you want a device/instrument/building to have, or to avoid, specific resonant frequencies.

In order to compute the eigenvalues of these operators, you "discretize" them to obtain a finite-dimensional matrix, and then you compute the eigenvalues of this matrix. In many cases, these matrices have sizes ranging in the hundreds of thousands to the hundreds of millions.
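As a toy illustration of this discretize-then-eigensolve pipeline (the 1D model problem and all names below are my own choices, not from the answer): the eigenvalues of the standard second-difference discretization of $-d^2/dx^2$ on $(0,1)$ with zero boundary conditions approximate $(k\pi)^2$, so their square roots approximate the resonance frequencies $k\pi$ of a vibrating unit string.

```python
import numpy as np

n = 200                       # number of interior grid points
h = 1.0 / (n + 1)             # mesh spacing

# Discretized -d^2/dx^2 on (0,1) with Dirichlet (clamped) boundary conditions:
# the classic tridiagonal (2, -1, -1)/h^2 stiffness matrix.
L = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

eigs = np.linalg.eigvalsh(L)  # symmetric matrix -> real eigenvalues, ascending
freqs = np.sqrt(eigs[:4])     # lowest four resonance frequencies

# Analytic frequencies of the unit string are k*pi, k = 1, 2, 3, 4
print(freqs)
print(np.pi * np.arange(1, 5))
```

With $n = 200$ the lowest discrete eigenvalues agree with $(k\pi)^2$ to a few parts in $10^5$; real applications differ only in scale, replacing this dense $200\times 200$ matrix with sparse matrices of dimension in the millions and an iterative eigensolver.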

Answer:

One of the most important and widely used applications of eigenvalues specifically, as opposed to singular values, comes from dynamical systems. Consider a linear ODE $$\dot{x} = Ax,$$ where $A$ is a diagonalizable $n\times n$ matrix. We then write its eigendecomposition and SVD as $A = P\Lambda P^{-1}$ and $A = U\Sigma V^*$. If we define $y = P^{-1}x$, then we have $$ \begin{aligned} \dot{x} = P\dot{y},&\quad Ax = P\Lambda y \\ \implies \dot{y} &= \Lambda y. \end{aligned} $$ Since $\Lambda$ is diagonal, this is easily solvable and, furthermore, we can determine whether the solution is growing, decaying, oscillating, etc. from the complex phases of the eigenvalues. This is very difficult to determine from the singular value decomposition, as all of the phase information is contained in the unitary matrices $U$ and $V$. Notice that the same trick won't work with the SVD: it relies on $A$ being similar to the diagonal matrix $\Lambda$, whereas the SVD is not a similarity transformation.

This is used all the time in physics and engineering applications, as eigenvalues with real part greater than zero imply that something will grow exponentially, such as spatial oscillations of a bridge or the reaction rate in a nuclear reactor.
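A small sketch of the point above (the matrix is my own example, chosen to be diagonalizable): a system whose eigenvalues all have negative real part decays, even though its largest singular value exceeds 1, so the singular values alone would not reveal the stability.

```python
import numpy as np

A = np.array([[-1.0,  5.0],
              [ 0.0, -2.0]])        # triangular: eigenvalues are -1 and -2

lam, P = np.linalg.eig(A)           # A = P diag(lam) P^{-1}
# Both eigenvalues have negative real part -> every solution decays.

def propagate(x0, t):
    """Solve x' = A x exactly via the eigendecomposition: x(t) = P e^{Lambda t} P^{-1} x0."""
    y0 = np.linalg.solve(P, x0)     # y = P^{-1} x
    return (P @ (np.exp(lam * t) * y0)).real

x0 = np.array([1.0, 1.0])
print(np.linalg.norm(propagate(x0, 10.0)))   # tiny: the solution has decayed

# The singular values miss this entirely: sigma_max(A) > 1
print(np.linalg.svd(A, compute_uv=False))
```

The largest singular value here is about 5, which tells you a single application of $A$ can stretch a vector, but says nothing about the long-time behavior of $\dot{x} = Ax$; the eigenvalues do.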

Answer:

The energy levels available to a system (e.g. an atom, molecule, material, etc.) are the eigenvalues of the system's Hamiltonian matrix.

The following diagram, presented to grade 9 students (typically ages 13 to 15) in the Ontario curriculum, labels four different energy levels, which correspond to the lowest four eigenvalues of the atom's Hamiltonian matrix:

[energy-level diagram omitted]

Therefore, all of spectroscopy is about eigenvalues and the differences between them.

It is how we know that there's water on Mars and CO2 on Venus, and it is how we know the composition of stars and of the universe as a whole:

[spectrum figure omitted]

We also use spectroscopy to check for pollutants in fuels, to check whether or not currency is counterfeit, and we use it in medical, geological, and atmospheric/climate applications among many, many other things.

The eigenvalues of the H atom within a non-relativistic model of the universe are known analytically, but for larger atoms and for molecules, liquids, solids, etc., and even for relativistic modeling of the H atom, we almost always obtain eigenvalues (energies) using numerical methods. For this exact reason, a chemist by the name of Ernest Davidson came up with one of the best ways to find the lowest eigenvalue of a matrix, now called the Davidson method. In only about 3.5 years, the word "eigenvalue" has come up 163 times on MMSE, so you can find a lot of real-world uses of eigenvalues.
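Here is a minimal sketch of a Davidson-style iteration for the lowest eigenvalue of a symmetric, diagonally dominant matrix (the kind of matrix the method was designed for in quantum chemistry). The test matrix, tolerances, and guard values are my own choices; real implementations add restarts, block versions, and better preconditioners.

```python
import numpy as np

def davidson_lowest(A, tol=1e-8, max_iter=200):
    """Davidson iteration for the lowest eigenvalue of a symmetric,
    diagonally dominant matrix A. A teaching sketch, not production code."""
    n = A.shape[0]
    d = np.diag(A)
    V = np.zeros((n, 0))                  # orthonormal search space
    t = np.zeros(n)
    t[np.argmin(d)] = 1.0                 # start from the smallest diagonal entry
    theta = d.min()
    for _ in range(max_iter):
        if V.shape[1] > 0:
            t = t - V @ (V.T @ t)         # orthogonalize new direction against V
        nt = np.linalg.norm(t)
        if nt < 1e-12 or V.shape[1] == n:
            break
        V = np.column_stack([V, t / nt])
        H = V.T @ A @ V                   # Rayleigh-Ritz projection
        w, S = np.linalg.eigh(H)
        theta = w[0]                      # lowest Ritz value
        u = V @ S[:, 0]                   # corresponding Ritz vector
        r = A @ u - theta * u             # residual
        if np.linalg.norm(r) < tol:
            break
        denom = d - theta                 # Davidson's diagonal preconditioner
        denom[np.abs(denom) < 1e-6] = 1e-6
        t = r / denom
    return theta

rng = np.random.default_rng(0)
n = 100
B = 1e-2 * rng.standard_normal((n, n))
A = np.diag(np.arange(1.0, n + 1)) + (B + B.T) / 2   # symmetric, diagonally dominant

print(davidson_lowest(A))
print(np.linalg.eigvalsh(A)[0])          # dense reference answer
```

The payoff in practice is that the loop only ever touches $A$ through matrix-vector products, so it works when $A$ is far too large to store or diagonalize densely.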

Answer:

I work primarily in the area of computer graphics, and I needed to compute an eigenvector just a few days ago!

When doing real-time cloth simulation, it can be useful to reconstruct the local coordinate system at a vertex as a rotation of that vertex's initial coordinate system, which is essentially a 3D shape-matching problem. If we describe the rotation as a quaternion, this reduces to finding the smallest value of a quadratic form on the unit sphere in 4 dimensions, or, equivalently, the smallest eigenvalue of a $4\times 4$ symmetric matrix.
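A minimal sketch of that last step (the matrix here is a random symmetric stand-in, since the actual entries depend on the shape-matching setup): the minimizer of $q^\top M q$ over unit quaternions $|q| = 1$ is the eigenvector belonging to the smallest eigenvalue, which `numpy.linalg.eigh` returns directly.

```python
import numpy as np

# Random symmetric 4x4 matrix standing in for the quadratic form that
# scores candidate rotation quaternions in the shape-matching problem.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
M = (M + M.T) / 2

w, Q = np.linalg.eigh(M)   # eigenvalues ascending, orthonormal eigenvectors
q = Q[:, 0]                # unit quaternion minimizing q^T M q over |q| = 1

print(w[0])                # smallest eigenvalue ...
print(q @ M @ q)           # ... equals the minimum of the quadratic form
```

Because $M$ is only $4\times 4$, this is cheap enough to do per vertex per frame, which is what makes the eigenvalue formulation attractive in real-time graphics.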
