Prove the $\ell^2$ norm of a linear transformation $A: \mathbb{R}^n \to \mathbb R^n$ is the maximum absolute value of its eigenvalues


If $A: \mathbb R^n \to \mathbb R^n$ is a linear transformation and $\mathbb R^n$ is equipped with $\lVert \cdot \rVert_2$, prove that $$ \lVert A \rVert := \sup \left\{ \frac{\lVert A \vec{x} \rVert_2}{\lVert \vec{x} \rVert_2} : \vec{x} \in \mathbb R^n, \vec{x} \neq 0\right\} = \max \{ \lvert \lambda_i \rvert : i = 1, \ldots, n \} $$ where $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of the matrix $B$ such that $A(\vec{x}) = B \vec{x}$ for every $\vec{x} \in \mathbb R^n$.

Above is the problem I'm struggling with. I don't know how to show it's true; I've searched a lot but still don't understand the concepts. Please help me understand and prove this!

The attached picture shows my work so far.
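For intuition, the $\sup$ in the definition of $\lVert A \rVert$ can be estimated numerically by sampling unit vectors and compared with the largest singular value of $B$ (which is the true value of the operator norm). This is a sketch I added for illustration, not part of the original post; the matrix is an arbitrary random example.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))  # arbitrary example matrix

# Approximate the sup over unit vectors by sampling many random directions.
xs = rng.standard_normal((3, 100000))
xs /= np.linalg.norm(xs, axis=0)  # normalize each column to a unit vector
sup_estimate = np.max(np.linalg.norm(B @ xs, axis=0))

# The exact operator norm is the largest singular value of B.
exact = np.linalg.norm(B, 2)

print(sup_estimate, exact)  # the sampling estimate approaches the exact value from below
```

With enough samples the estimate gets arbitrarily close to $\lVert B \rVert$, but it can never exceed the largest singular value.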


1 Answer


The claim as stated is false. The transformation $T(1,0)=(1,0)$, $T(0,1)=(1,1)$ is non-singular and has $1$ as its only eigenvalue, yet $||T|| \ge ||T(0,1)||_2 = \sqrt 2 > 1$. (In general $||T||$ equals the largest singular value of its matrix $B$, i.e. $\sqrt{\lambda_{\max}(B^T B)}$, which here is $(1+\sqrt 5)/2$.) The stated equality does hold for symmetric matrices (even singular ones!), but the result as claimed is false.
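A quick numerical check of this counterexample (added here as a sketch, not part of the original answer). The matrix of $T$ has columns $T(1,0)=(1,0)$ and $T(0,1)=(1,1)$:

```python
import numpy as np

# Matrix of the counterexample: columns are T(1,0) = (1,0) and T(0,1) = (1,1).
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigenvalues = np.linalg.eigvals(B)    # both eigenvalues equal 1
operator_norm = np.linalg.norm(B, 2)  # largest singular value, ~1.618

print(eigenvalues, operator_norm)  # norm strictly exceeds max |eigenvalue| = 1

# For a symmetric matrix the stated equality does hold:
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
print(np.isclose(np.linalg.norm(S, 2),
                 max(abs(np.linalg.eigvals(S)))))  # True
```

Note that the operator norm of $B$ comes out to $(1+\sqrt 5)/2 \approx 1.618$, the largest singular value, while every eigenvalue has absolute value $1$.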