From this article about the singular value decomposition:
Let $A$ be an $n \times d$ matrix and think of the rows of $A$ as $n$ points in $d$-dimensional space. The Frobenius norm of $A$ is the square root of the sum of the squared distances of the points to the origin. The 2-norm is the square root of the sum of squared distances to the origin along the direction that maximizes this quantity.
Can someone help me understand this remark? Thanks!
The first statement says $$\Vert A \Vert_F^2 = \sum_{k=1}^n \Vert p_k - 0 \Vert_2^2 = \sum_{k=1}^n \sum_{i=1}^d (p_k)_i^2,$$ which is true because the $k$-th point $p_k$ is the $k$-th row of $A$, i.e. $(p_k)_i = A_{ki}$.
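As a quick numerical sanity check (using a hypothetical random matrix, not anything from the article), the squared Frobenius norm can be compared against the sum of the squared row norms:

```python
import numpy as np

# Hypothetical example: 5 points in 3-dimensional space as rows of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Squared Frobenius norm of A.
fro_sq = np.linalg.norm(A, "fro") ** 2

# Sum of squared distances of the row-points to the origin.
row_dist_sq = sum(np.linalg.norm(row) ** 2 for row in A)

print(np.isclose(fro_sq, row_dist_sq))  # → True
```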
For the latter, a similar interpretation holds with $\Vert A \Vert_2^2 = \lambda_{\max}(A^\top A)$, the largest eigenvalue of $A^\top A$ (equivalently, the square of the largest singular value of $A$). Put mathematically, and taking a "direction" to mean a unit vector, the remark says $$\Vert A \Vert_2^2 = \sup_{\Vert x\Vert_2 = 1} \Vert Ax \Vert_2^2 = \sup_{x\in \mathbb R^d \setminus \{0\}} \frac{\Vert Ax \Vert_2^2}{\Vert x\Vert_2^2},$$ which coincides with the definition of the $2$-norm as the induced operator norm. The last equality holds because the quotient $\Vert Ax \Vert_2^2 / \Vert x\Vert_2^2$ is invariant under rescaling $x$, so the supremum may be restricted to the unit sphere. To connect this with the geometric picture: for a unit vector $v$ we have $\Vert Av\Vert_2^2 = \sum_{k=1}^n \langle p_k, v\rangle^2$, the sum of squared lengths of the projections of the points onto the direction $v$, which is exactly the quantity the remark says is maximized.
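This, too, can be checked numerically (again on a hypothetical random matrix): the operator 2-norm squared should agree with the largest eigenvalue of $A^\top A$, and no sampled unit direction should beat it:

```python
import numpy as np

# Hypothetical example matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))

# ||A||_2 computed as the largest singular value.
sigma_max = np.linalg.norm(A, 2)

# Largest eigenvalue of A^T A (symmetric, so eigvalsh applies).
lam_max = np.linalg.eigvalsh(A.T @ A).max()
print(np.isclose(sigma_max ** 2, lam_max))  # → True

# Sample many random unit directions; none should exceed sigma_max,
# illustrating that ||A||_2 is the supremum of ||Ax||_2 over the unit sphere.
xs = rng.standard_normal((10000, 3))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
best = max(np.linalg.norm(A @ x) for x in xs)
print(best <= sigma_max + 1e-12)  # → True
```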