Proving $\lambda_{\text{min}}\leq R(x)$ without Spectral Theorem


Given a real-symmetric (or Hermitian), positive definite matrix $A$, it is well known that: $$\lambda_{\min}\leq\dfrac{(x,Ax)}{(x,x)}. \tag{1}$$

This is a direct consequence of the min-max theorem, and it is also easily proved from the fact that such an $A$ has an orthonormal eigenbasis. But is there any way to prove it without invoking the spectral theorem, the SVD, or anything similarly powerful?

The best I could do was: $$\dfrac{(x,Ax)}{(x,x)}\geq \lambda\cdot\dfrac{(x,v)^2}{(v,v)(x,x)} \tag{2}$$ where $(\lambda, v)$ is an arbitrary eigenpair of $A$; by Cauchy–Schwarz this is weaker than $(1)$.
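As a quick numerical sanity check of $(1)$ and of the weaker bound $(2)$, here is a NumPy sketch; the SPD matrix below is random test data, not anything from the question:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random symmetric positive definite test matrix:
# M M^T is positive semidefinite, adding I makes it definite.
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)

lams, vecs = np.linalg.eigh(A)
lam_min = lams[0]

x = rng.standard_normal(n)
R = x @ A @ x / (x @ x)

# (1): the Rayleigh quotient dominates the smallest eigenvalue.
assert R >= lam_min - 1e-12

# (2): for every eigenpair (lam, v), R >= lam * (x,v)^2 / ((v,v)(x,x)).
for lam, v in zip(lams, vecs.T):
    assert R >= lam * (x @ v) ** 2 / ((v @ v) * (x @ x)) - 1e-12
```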



Best answer

Let $m = \inf_{x\ne 0}\frac{\langle Ax,x\rangle}{\|x\|^2} = \inf_{\|x\|=1}\langle Ax,x\rangle$. We claim that $\lambda_{\text{min}} \le m$.

It suffices to show that $m$ is indeed an eigenvalue for $A$.

For that, consider a positive semidefinite matrix $B \ge 0$ and recall that $$\|B\| = \sup_{\|x\| = 1} |\langle Bx,x\rangle| = \sup_{\|x\| = 1} \langle Bx,x\rangle.$$

Pick a sequence $(x_n)_n$ of vectors on the unit sphere such that $\|B\| = \lim_{n\to\infty} \langle Bx_n, x_n\rangle$. We have $$\|Bx_n - \|B\|x_n\|^2 = \|Bx_n\|^2 + \|B\|^2 - 2\|B\|\langle Bx_n, x_n\rangle \le 2\|B\|(\|B\| - \langle Bx_n, x_n\rangle ) \xrightarrow{n\to\infty} 0$$

so $\lim_{n\to\infty} \|Bx_n - \|B\|x_n\| = 0$. We conclude that $B - \|B\|I$ is not bounded from below so it cannot be invertible. Hence $\|B\|$ is an eigenvalue of $B$.
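This step can be illustrated numerically (a NumPy sketch; $B$ here is a random positive semidefinite matrix, an assumption for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Hypothetical positive semidefinite test matrix B = C C^T.
C = rng.standard_normal((n, n))
B = C @ C.T

norm_B = np.linalg.norm(B, 2)  # spectral norm = sup of <Bx, x> on the sphere

# B - ||B|| I is singular, i.e. ||B|| is an eigenvalue of B:
# its smallest singular value is (numerically) zero.
smallest_sv = np.linalg.svd(B - norm_B * np.eye(n), compute_uv=False)[-1]
assert smallest_sv < 1e-10
```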

Now consider $B = \|A\|I - A \ge 0$. We have $$\|B\| = \sup_{\|x\| = 1} \langle Bx,x\rangle = \|A\| - \inf_{\|x\| = 1} \langle Ax,x\rangle = \|A\| - m$$

Above we showed that $\|B\| = \|A\|-m$ is an eigenvalue of $B = \|A\|I - A$ so $m$ is an eigenvalue of $A$.
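The shift trick can also be checked numerically (a sketch on a random SPD matrix, assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

# Random symmetric positive definite test matrix.
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)

norm_A = np.linalg.norm(A, 2)
B = norm_A * np.eye(n) - A  # B >= 0 since <Ax,x> <= ||A|| ||x||^2

# ||B|| = ||A|| - m, so m = ||A|| - ||B|| recovers the smallest eigenvalue of A.
m = norm_A - np.linalg.norm(B, 2)
assert np.isclose(m, np.linalg.eigvalsh(A)[0])
```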


In fact, we have $\lambda_{\text{min}}= m$.

To see this, notice that every eigenvalue $\lambda$ of $A$ satisfies $\lambda \ge m$. Indeed, suppose $\lambda = m-\varepsilon$ for some $\varepsilon > 0$.

Then for any vector $x$ we have $$\langle (A-\lambda I)x,x\rangle = \langle Ax,x\rangle - \lambda\langle x,x\rangle \ge (m-\lambda)\|x\|^2 = \varepsilon\|x\|^2$$

so $$\|(A-\lambda I)x\|\|x\| \ge |\langle (A-\lambda I)x,x\rangle|\ge \varepsilon\|x\|^2$$

Hence $A-\lambda I$ is bounded from below and thus injective. Therefore $\lambda$ cannot be an eigenvalue.
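The lower bound $\|(A-\lambda I)x\| \ge \varepsilon\|x\|$ for $\lambda = m - \varepsilon$ can be spot-checked numerically (NumPy sketch on a random SPD matrix, assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5

# Random symmetric positive definite test matrix.
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)

m = np.linalg.eigvalsh(A)[0]  # m = inf of the Rayleigh quotient
eps = 0.1
lam = m - eps

# ||(A - lam I) x|| >= eps ||x||: A - lam I is bounded below, hence injective,
# so lam cannot be an eigenvalue.
for _ in range(100):
    x = rng.standard_normal(n)
    lhs = np.linalg.norm((A - lam * np.eye(n)) @ x)
    assert lhs >= eps * np.linalg.norm(x) - 1e-12
```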

Another answer

Let $$R(X) = \frac{\langle X, AX \rangle}{\langle X, X \rangle},$$

which (by setting $U=X/\|X\|$, i.e. restricting our attention to the unit sphere) is the same as

$$R(U)=\langle U, AU \rangle.$$

There is a classical result

Proposition: For any $U$ on the unit sphere, $R(U)$ is a barycenter (a normalized weighted mean with nonnegative coefficients) of the eigenvalues of $A$.

Corollary: $R(U)$ necessarily lies in the interval $[\lambda_{\min},\lambda_{\max}]$.

As I haven't seen it explained in a simple, non-allusive way, I prove it here:

Proof: consider an eigendecomposition

$$A=P^TDP \ \ \text{with $P$ orthogonal and} \ \ D=\operatorname{diag}(\lambda_1,\dots,\lambda_n)$$

Then : $$R(U)=\langle U, P^TDPU \rangle =\langle \underbrace{PU}_V, DPU \rangle=\langle V, DV \rangle \tag{1}$$

Letting $v_1,v_2,\dots,v_n$ be the coordinates of $V$ (recall that $v_1^2+v_2^2+\cdots+v_n^2=1$, because $P$ is orthogonal and $U$ belongs to the unit sphere), we can write:

$$R(U)=v_1\lambda_1v_1+ v_2\lambda_2v_2+\cdots + v_n\lambda_nv_n$$

i.e.,

$$R(U)=v_1^2\lambda_1+ v_2^2\lambda_2+\cdots + v_n^2\lambda_n$$

which is the desired weighted average of the $\lambda_k$, with nonnegative weights $v_k^2$ summing to $1$.
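The barycenter identity is easy to verify numerically (NumPy sketch; the symmetric matrix is random test data, and note that NumPy's `eigh` returns $A = PDP^T$, so the eigenbasis coordinates are $V = P^T U$ rather than $PU$):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4

# Random symmetric test matrix (definiteness is not needed for this identity).
M = rng.standard_normal((n, n))
A = (M + M.T) / 2

lams, P = np.linalg.eigh(A)  # A = P diag(lams) P^T, with P orthogonal
U = rng.standard_normal(n)
U /= np.linalg.norm(U)       # restrict to the unit sphere

V = P.T @ U                  # coordinates of U in the eigenbasis
weights = V ** 2             # nonnegative, sum to 1 since ||V|| = ||U|| = 1

R = U @ A @ U
assert np.isclose(weights.sum(), 1.0)  # barycentric weights
assert np.isclose(R, weights @ lams)   # R(U) is the weighted mean
assert lams[0] - 1e-12 <= R <= lams[-1] + 1e-12  # the corollary
```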