Prove that the first eigenvector is an optimal solution to the optimization problem:


$A \in \mathbb{R}^{d \times d}$ is symmetric and positive semidefinite with eigenvalues $\lambda_1 > ... > \lambda_d$ and corresponding set of orthonormal eigenvectors $u^1, u^2, ..., u^d \in \mathbb{R}^d$. Prove that $u^1$ solves:

$\max_{w \in \mathbb{R}^d: ||w||_2 = 1} w^TAw$



You may use the spectral theorem to write $A = Q \Lambda Q^T$, where $Q$ is orthogonal and $\Lambda$ is diagonal with entries $\lambda_1, \ldots, \lambda_d$. The objective becomes $w^T Q \Lambda Q^T w = \|\sqrt{\Lambda} Q^T w\|^2$, so with the change of coordinates $x = Q^T w$ (which preserves the constraint, since $\|x\|_2 = \|Q^T w\|_2 = \|w\|_2 = 1$) the objective is $\|\sqrt{\Lambda} x\|^2 = \sum_i \lambda_i x_i^2$. Since $\lambda_1$ is the largest eigenvalue, this is maximized by taking $x_1 = \pm 1$ and the other coordinates zero, i.e. $w = Qx = \pm u^1$.
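As a sanity check (not part of the proof), the change of coordinates can be verified numerically. The matrix below is a hypothetical example; NumPy's `eigh` returns eigenvalues in ascending order, so the top eigenvector is the last column.

```python
import numpy as np

# Hypothetical symmetric PSD matrix for illustration.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T  # B B^T is symmetric positive semidefinite

# Spectral decomposition A = Q diag(lam) Q^T; eigh sorts eigenvalues ascending.
lam, Q = np.linalg.eigh(A)

# For any unit vector w, w^T A w = sum_i lam_i * x_i^2 with x = Q^T w.
w = rng.standard_normal(4)
w /= np.linalg.norm(w)
x = Q.T @ w
print(np.isclose(w @ A @ w, np.sum(lam * x**2)))  # True

# The maximum value lambda_1 is attained at the corresponding eigenvector.
u1 = Q[:, -1]
print(np.isclose(u1 @ A @ u1, lam[-1]))  # True
```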

If you do not have "access" to the spectral theorem, you can use Lagrange multipliers: with the Lagrangian $\mathcal{L}(w, c) = w^T A w - c(\|w\|_2^2 - 1)$, setting $\nabla_w \mathcal{L} = 2Aw - 2cw = 0$ gives $A w_{\mathrm{opt}} = c\, w_{\mathrm{opt}}$ for some constant $c$, which is nothing but the definition of an eigenvector. Knowing that the optimal input must be an eigenvector, it follows that you cannot beat a unit eigenvector $v$ for the largest eigenvalue, which yields $v^T A v = v^T \lambda_1 v = \lambda_1 \|v\|^2 = \lambda_1$.
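The stationarity condition can also be checked numerically; the matrix here is a hypothetical example, not taken from the question.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = B @ B.T  # symmetric PSD

lam, Q = np.linalg.eigh(A)
u1 = Q[:, -1]  # unit eigenvector for the largest eigenvalue

# Stationarity: grad of w^T A w is 2 A w, grad of ||w||^2 is 2 w,
# so 2 A w = 2 c w, i.e. A w = c w -- the eigenvector equation.
print(np.allclose(A @ u1, lam[-1] * u1))  # True

# The attained objective value equals lambda_1.
print(np.isclose(u1 @ A @ u1, lam[-1]))  # True
```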


Every $w$ with $\|w\|_2=1$ can be written as $c_1 u^1 + \cdots + c_d u^d$ for some scalars $c_1, \ldots, c_d$ satisfying $c_1^2 + \cdots + c_d^2 = 1$.

Then $$w^\top A w = \sum_{i,j} (c_i c_j)((u^i)^\top A u^j) = \sum_{i,j} (c_i c_j)\lambda_j ((u^i)^\top u^j) = \sum_j c_j^2 \lambda_j.$$ How would you choose $c_1, \ldots, c_d$ to maximize the right-hand side under the constraint $c_1^2 + \cdots + c_d^2=1$?
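Numerically, the expansion above says $w^\top A w$ is a convex combination of the eigenvalues, hence bounded by $\lambda_1$. A quick sketch with a hypothetical matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((6, 6))
A = B @ B.T  # symmetric PSD
lam, U = np.linalg.eigh(A)  # columns of U: orthonormal eigenvectors

# Expand a random unit vector w in the eigenbasis: c = U^T w.
w = rng.standard_normal(6)
w /= np.linalg.norm(w)
c = U.T @ w
print(np.isclose(np.sum(c**2), 1.0))  # True: the c_j^2 sum to 1

# w^T A w = sum_j c_j^2 lambda_j, a convex combination bounded by lambda_1.
print(np.isclose(w @ A @ w, np.sum(c**2 * lam)))  # True
print(w @ A @ w <= lam[-1] + 1e-9)  # True
```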