Generalization of a symmetric eigenvalue problem for linear operators on a Hilbert space


Let $A\in\mathbb R^{n\times n}$. Which assumptions on $A$ do we need and how can we show that $$\max_{\substack{z\in\mathbb R^n\\|z|=1}}\langle Az,z\rangle\tag1$$ is attained at the unit eigenvector $z_{\text{max}}$ associated with the largest eigenvalue $\lambda_{\text{max}}$ of the symmetric matrix $A+A^T$ and the optimal objective value is the logarithmic norm of $A$?

A reference would be enough. What I would like to know further: Is there a generalization of this result to linear operators on a Hilbert space?

Best answer:

You have \begin{align} \langle Az,z\rangle&=\tfrac12\,\left(\langle Az,z\rangle+\langle Az,z\rangle\right) =\tfrac12\,\left(\langle Az,z\rangle+\langle z,A^Tz\rangle\right)\\ &=\tfrac12\,\left(\langle Az,z\rangle+\langle A^Tz,z\rangle\right) =\tfrac12\,\langle (A+A^T)z,z\rangle.\\ \end{align} Now since $A+A^T$ is symmetric, the Spectral Theorem gives us that $$\tag2 A+A^T=\sum_j\lambda_j\,P_j, $$ where $\lambda_1\geq\lambda_2\geq\cdots\geq\lambda_n$ are the eigenvalues of $A+A^T$ (listed with multiplicity) and the $P_j$ are pairwise orthogonal rank-one projections. Then $$\tag3 \langle (A+A^T)z,z\rangle=\sum_j\lambda_j\langle P_jz,z\rangle. $$ The numbers $\langle P_1z,z\rangle,\ldots,\langle P_nz,z\rangle$ are non-negative and add to $1$, so $(3)$ can be seen as a convex combination of $\lambda_1,\ldots,\lambda_n$.
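A quick numerical sanity check of this identity (a sketch; the random matrix and test vector are arbitrary): the weights $\langle P_jz,z\rangle=(v_j\cdot z)^2$ sum to $1$, and $\langle(A+A^T)z,z\rangle$ equals the corresponding convex combination of eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
S = A + A.T  # the symmetric matrix from the derivation

# Spectral decomposition: S = sum_j lam[j] * v_j v_j^T,
# with orthonormal eigenvector columns v_j of V.
lam, V = np.linalg.eigh(S)

z = rng.standard_normal(n)
z /= np.linalg.norm(z)  # unit test vector

# weights <P_j z, z> = (v_j . z)^2: non-negative and summing to 1
w = (V.T @ z) ** 2

# <Sz, z> is the convex combination sum_j lam_j * w_j,
# and <Az, z> is half of it
quad_S = z @ S @ z
quad_A = z @ A @ z
```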

A convex combination of $\lambda_1,\ldots,\lambda_n$ is at most $\lambda_1$, with equality exactly when all the weight sits on $\lambda_1$. Thus $$ \max_{\|z\|=1}\langle Az,z\rangle=\tfrac12\,\lambda_1, $$ and the max is attained at any unit eigenvector $z_1$ of $A+A^T$ corresponding to $\lambda_1$. This optimal value $\tfrac12\,\lambda_1=\lambda_{\max}\big(\tfrac12(A+A^T)\big)$ is precisely the logarithmic norm of $A$ with respect to the Euclidean norm. Note that no assumptions on $A$ beyond $A\in\mathbb R^{n\times n}$ are needed.
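The conclusion can be checked numerically as well (a sketch with an arbitrary random matrix): the top eigenvector of $A+A^T$ achieves the value $\tfrac12\lambda_1$, and no other unit vector exceeds it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n))
S = A + A.T

lam, V = np.linalg.eigh(S)   # eigenvalues in ascending order
lam1 = lam[-1]               # largest eigenvalue of A + A^T
z1 = V[:, -1]                # corresponding unit eigenvector

# claimed optimum: (1/2) * lam1, the Euclidean logarithmic norm of A
mu = 0.5 * lam1
attained = z1 @ A @ z1       # value of <Az, z> at the top eigenvector

# sample many random unit vectors; none should beat mu
Z = rng.standard_normal((10000, n))
Z /= np.linalg.norm(Z, axis=1, keepdims=True)
vals = np.einsum('ij,jk,ik->i', Z, A, Z)  # vals[i] = z_i^T A z_i
```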

When you go infinite-dimensional, you find two problems: first, the max need not exist, so you have to consider a supremum instead. More importantly, many operators have no eigenvalues at all, so the supremum need not be attained at an eigenvector.
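A concrete instance of the first problem (a sketch, using a standard example not from the original answer): the diagonal operator on $\ell^2$ with entries $1-\tfrac1n$ is self-adjoint with eigenvalues $1-\tfrac1n$, so $\sup_{\|z\|=1}\langle Az,z\rangle=1$, but $1$ is not an eigenvalue and the supremum is never attained. Finite truncations show the optimum creeping up toward $1$ without reaching it.

```python
import numpy as np

# Diagonal operator on l^2 with diagonal entries 1 - 1/n.
# For the N-dimensional truncation, max <Az, z> over unit z
# is simply the largest diagonal entry, 1 - 1/N.
best = []
for N in (10, 100, 1000):
    d = 1.0 - 1.0 / np.arange(1, N + 1)
    best.append(d.max())
# best approaches 1 but every entry stays strictly below 1
```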