I'm hoping that someone can help me fix my proof for the following theorem:
Given an $n \times n$ symmetric matrix $A$,
$$ \max_{x : ||x||_2 = 1} x^{\top}Ax = \max \lambda(A), $$
where $\max \lambda(A)$ is the maximum eigenvalue of $A$.
My proof attempt:
Let $A$ be a symmetric matrix with eigenvalue decomposition $A = V\Lambda V^{\top}$. Taking the square root of $A$, we have that $A^{1/2} = V\Lambda^{1/2} V^{\top}$. Thus,
$$ x^{\top}Ax = x^{\top}A^{1/2}A^{1/2}x = x^{\top}(A^{1/2})^{\top}A^{1/2}x = ||A^{1/2}x||_2^2. $$
Recall now that $||A^{1/2}x||_2^2$ has maximum $\max \lambda(A^{\top}A)$ (the largest eigenvalue of $(A^{1/2})^{\top}A^{1/2} = A$), since it reduces to maximizing a spectral norm.
I realize that this proof falls apart very early on, since when I take the square root of $A$, I am assuming that it is positive semidefinite (otherwise we couldn't take the square root of $\Lambda$ over $\mathbb{R}$).
Is there a small fix that I can apply here? Or better yet, is there a more concise proof altogether?
Yes and yes.
The fix would be to add a multiple of the identity such that $A+cI$ is positive definite, since this just shifts the claimed equation by $c$ on both sides. (I didn't check your proof in detail; it seems you're missing square roots where it says $\lambda(A^\top A)$?)
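To spell the shift out: for any unit vector $x$ and any $c\in\mathbb{R}$,
$$ x^\top(A+cI)x = x^\top Ax + c\,x^\top x = x^\top Ax + c, $$
while $A+cI$ has the same eigenvectors as $A$ with every eigenvalue shifted by $c$, so $\max\lambda(A+cI)=\max\lambda(A)+c$. Hence the identity for $A+cI$ implies the identity for $A$, and by Gershgorin's circle theorem any $c>\max_i\sum_j|a_{ij}|$ makes $A+cI$ positive definite.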
The more concise proof would be to write
$$ x^\top Ax=x^\top V\Lambda V^\top x=(V^\top x)^\top\Lambda (V^\top x)=\sum_i\Lambda_i(V^\top x)_i^2 $$
and note that since $V^\top$ is orthogonal, $V^\top x$ runs through all unit vectors as $x$ does. Because $\sum_i(V^\top x)_i^2=1$, the sum is a weighted average of the diagonal entries $\Lambda_i$, so it is at most $\max_i\Lambda_i=\max\lambda(A)$, with equality for $(V^\top x)_i=\delta_{i,i_\text{max}}$, where $i_\text{max}$ indexes the largest eigenvalue.
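If you want to convince yourself numerically, here is a minimal NumPy sketch (the seed, dimension, and sample count are arbitrary choices, not part of the argument): it checks that the quadratic form attains $\max\lambda(A)$ at the top eigenvector and never exceeds it on random unit vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                 # random symmetric matrix

w, V = np.linalg.eigh(A)          # eigenvalues ascending, columns of V orthonormal
v = V[:, -1]                      # unit eigenvector for the largest eigenvalue
print(w[-1], v @ A @ v)           # these agree: the maximum is attained at v

# no random unit vector beats the top eigenvalue
X = rng.standard_normal((10_000, n))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # normalize rows to unit length
assert np.all(np.einsum('ij,jk,ik->i', X, A, X) <= w[-1] + 1e-10)
```

(`numpy.linalg.eigh` is the right routine here since it exploits symmetry and returns real eigenvalues in ascending order.)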