Is it possible to isolate an eigenvalue for a general case with the equation $A\mathbf{x} = \lambda_i\mathbf{x}$?


I know the derivation is probably available in many places, but perhaps I'm wording my question wrong: I cannot find an example online that matches what I am looking for.

My question is: given an $n \times n$ symmetric matrix $A$, what is the general way to isolate $\lambda_i$, going from $A\mathbf{x} = \lambda_i\mathbf{x}$ to $\lambda_i = \dots$?

Most examples I've seen either use the spectral theorem or assume the matrix and its eigenvectors are already given. Is there an intuitive explanation?


Take the equation and multiply on the left by $\mathbf{x}^T$ (with $\mathbf{x}$ normalized, so $\mathbf{x}^T\mathbf{x} = 1$); then $\mathbf{x}^TA\mathbf{x} = \lambda_i \mathbf{x}^T\mathbf{x} = \lambda_i$.

Note that scalar-vector multiplication commutes, so $\lambda_i$ can be pulled out front:

\begin{equation} \left[ \begin{matrix} x_1 & x_2 \end{matrix} \right] \lambda_i \left[ \begin{matrix} x_1 \\ x_2 \end{matrix} \right] = \left[ \begin{matrix} \lambda_i x_1 & \lambda_i x_2 \end{matrix} \right] \left[ \begin{matrix} x_1 \\ x_2 \end{matrix} \right] = \lambda_i \left[ \begin{matrix} x_1 & x_2 \end{matrix} \right] \left[ \begin{matrix} x_1 \\ x_2 \end{matrix} \right] \end{equation}
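As a quick numerical check of this identity (a sketch using NumPy; the matrix values here are illustrative, not from the question):

```python
import numpy as np

# A small symmetric matrix (illustrative values).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Get a unit eigenvector from NumPy's symmetric eigensolver.
eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order
x = eigvecs[:, 1]                      # unit eigenvector for the largest eigenvalue

# Isolate the eigenvalue via x^T A x = lambda * x^T x = lambda.
lam = x @ A @ x
print(lam)                             # ≈ 3.0, matching eigvals[1]
```

The quantity $\mathbf{x}^TA\mathbf{x}\,/\,\mathbf{x}^T\mathbf{x}$ is usually called the Rayleigh quotient; for a normalized eigenvector the denominator is $1$ and it returns the eigenvalue exactly.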

General idea

Suppose $A$ has an orthonormal basis of eigenvectors $\mathbf{x}_1,\mathbf{x}_2, \dots, \mathbf{x}_n$ corresponding to distinct eigenvalues $\lambda_1>\lambda_2>\cdots>\lambda_n >0$.

Every unit vector $\mathbf{x}$ has an expansion $\mathbf{x} = \sum \alpha_i \mathbf{x}_i$, where $\sum \alpha_i^2 = 1$.

Then $A\mathbf{x} = \sum \alpha_i A\mathbf{x}_i = \sum \alpha_i \lambda_i\mathbf{x}_i$, so $\mathbf{x}^TA\mathbf{x} = \sum \alpha_i^2 \lambda_i$ (using the orthonormality of the $\mathbf{x}_i$).
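This identity is easy to verify numerically (a sketch; the random symmetric matrix is just an illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# Any symmetric matrix has an orthonormal basis of eigenvectors.
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2
eigvals, X = np.linalg.eigh(A)   # columns of X are orthonormal eigenvectors

# An arbitrary unit vector and its expansion coefficients alpha_i = x_i^T x.
x = rng.standard_normal(5)
x /= np.linalg.norm(x)
alpha = X.T @ x

# x^T A x equals sum_i alpha_i^2 * lambda_i, and sum_i alpha_i^2 = 1.
lhs = x @ A @ x
rhs = np.sum(alpha**2 * eigvals)
print(lhs, rhs, np.sum(alpha**2))
```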

The maximum value of this expression occurs when $\alpha_1 = 1$ and all other $\alpha_i$ are $0$, giving $\lambda_1$. If we restrict to the orthogonal complement of $\mathbf{x}_1$ (i.e. require $\alpha_1 = 0$), the maximum occurs at $\alpha_2 = 1$, giving $\lambda_2$, and so on.
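The maximization and deflation steps can be sketched numerically as well (again with an illustrative random symmetric matrix, whose eigenvalues need not be positive, but the max/restriction argument is unchanged):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2
eigvals, X = np.linalg.eigh(A)        # eigenvalues in ascending order
lam_max = eigvals[-1]

# Rayleigh quotients of random unit vectors never exceed the top eigenvalue ...
samples = rng.standard_normal((1000, 4))
samples /= np.linalg.norm(samples, axis=1, keepdims=True)
quotients = np.einsum('ij,jk,ik->i', samples, A, samples)  # x^T A x per row
print(quotients.max() <= lam_max)      # True

# ... and after projecting out x_max (forcing alpha_1 = 0), the quotient is
# bounded by the second-largest eigenvalue instead.
x_max = X[:, -1]
deflated = samples - np.outer(samples @ x_max, x_max)
deflated /= np.linalg.norm(deflated, axis=1, keepdims=True)
q2 = np.einsum('ij,jk,ik->i', deflated, A, deflated)
print(q2.max() <= eigvals[-2] + 1e-12) # True
```

Repeating this restrict-and-maximize step recovers the whole spectrum one eigenvalue at a time, which is the variational (Courant–Fischer) characterization of eigenvalues.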