A matrix norm for which $\lvert \lvert A \rvert \rvert = \lvert \lvert \ \lvert A \rvert \ \rvert \rvert$ is called an absolute norm, where $\lvert A \rvert$ denotes the matrix of absolute values of the entries of $A$. Prove that $\lvert \lvert \cdot \rvert\rvert_{2}$ is not an absolute norm.
Attempt at solution
[Definition] $$\lvert \lvert A \rvert\rvert_{2} = \max\limits_{i = 1,\dots,n} \sigma_{i}(A),$$ where the $\sigma_{i}$ denote the singular values of $A$, that is, the square roots of the eigenvalues of $A^T A$.
[Definition] $$ \lvert \lvert A \rvert \rvert_{2} = \max\limits_{\lvert\lvert x \rvert \rvert_{2}=1} \lvert \lvert Ax \rvert \rvert_{2}.$$
[Lemma] If $A \in \mathbb{R}^{m \times n}$, then the eigenvalues of $A^T A$ are non-negative.
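These definitions and the lemma can be sanity-checked numerically (a quick sketch with NumPy on a random matrix; the shape and seed are arbitrary choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))

# A^T A is symmetric positive semidefinite, so its eigenvalues
# are real and non-negative (up to floating-point round-off)
eigvals = np.linalg.eigvalsh(A.T @ A)
print(eigvals.min() >= -1e-12)  # True

# the square root of the largest eigenvalue of A^T A
# is the largest singular value, i.e. the spectral norm
print(np.isclose(np.sqrt(eigvals.max()), np.linalg.norm(A, 2)))  # True
```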
I suppose one could find a counterexample, but that looks computationally heavy and I was not able to think of one.
\begin{gather*} \langle(A^T A)u , u \rangle = \lambda \lvert \lvert u \rvert \rvert_{2}^{2} \implies \lambda = \frac {\langle(A^T A)u , u \rangle}{\lvert \lvert u \rvert \rvert_{2}^{2}} = \frac{\lvert \lvert A u \rvert \rvert_{2}^{2} }{\lvert \lvert u \rvert \rvert_{2}^{2}}. \end{gather*} Let us consider $A^T A$ and $\lvert A\rvert^T \lvert A \rvert$. Suppose they share the same eigenvector $u$; then they should share the same eigenvalue. Then, $$ \lambda_{1} = \frac{\lvert \lvert A u \rvert \rvert_{2}^{2} }{\lvert \lvert u \rvert \rvert_{2}^{2}}, \qquad \lambda_{2} = \frac{\lvert \lvert \ \lvert A \rvert \ u \rvert \rvert_{2}^{2} }{\lvert \lvert u \rvert \rvert_{2}^{2}}, $$ and for $\lambda_{1} = \lambda_{2}$ we would need $\lvert \lvert Au \rvert \rvert_{2} = \lvert \lvert \ \lvert A \rvert \ u \rvert \rvert_{2}$, which is not always the case. Thus, $$ \lvert \lvert \ \lvert A \rvert \ \rvert \rvert_{2} = \max\limits_{i = 1,\dots,n} \sigma_{i} (\lvert A \rvert ) \neq \max\limits_{i = 1,\dots,n} \sigma_{i} (A).$$
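On the counterexample route, small matrices are actually cheap to test numerically; here is a quick sketch in NumPy trying the $2 \times 2$ matrix $\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$ (a candidate of my own, chosen because a sign change makes $\lvert A \rvert$ differ from $A$):

```python
import numpy as np

# candidate: a matrix with a sign change, so that |A| differs from A
A = np.array([[1.0,  1.0],
              [1.0, -1.0]])
absA = np.abs(A)

# spectral norms (largest singular values)
print(np.linalg.norm(A, 2))     # sqrt(2) ≈ 1.4142...
print(np.linalg.norm(absA, 2))  # 2.0
```

Here $A^T A = 2I$, so $\lvert \lvert A \rvert \rvert_{2} = \sqrt{2}$ can also be checked by hand, while $\lvert A \rvert^T \lvert A \rvert$ has eigenvalues $0$ and $4$, giving $\lvert \lvert \ \lvert A \rvert \ \rvert \rvert_{2} = 2$.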
I don't think my solution is correct. Can anyone confirm, or give a clue on how to solve this problem?