Let $A$ be a $k\times m$ matrix and $B$ be an $m\times n$ matrix. I wonder how to prove the following inequality:
$$\|AB\|_F\le\|A\| \|B\|_F,$$
where $\|\cdot\|_F$ is the Frobenius norm (the square root of the sum of all squared entries) and $\|\cdot\|$ is the 2-operator norm (the largest singular value).
Note that if $n=1$, i.e. when $B$ is a column vector, this follows directly from the definition of the operator norm. But I don't know how to handle the general case. I have thought about using the SVD of $A$ and $B$, but I don't know how to simplify the left-hand side. Any approach would be appreciated!
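Not a proof, but before investing in one it is easy to sanity-check the inequality numerically on random matrices; a minimal sketch with NumPy (note `np.linalg.norm(A, 2)` returns the largest singular value, i.e. the 2-operator norm):

```python
import numpy as np

# Check ||AB||_F <= ||A||_2 * ||B||_F on random real matrices of random shapes.
rng = np.random.default_rng(0)
for _ in range(100):
    k, m, n = rng.integers(1, 8, size=3)
    A = rng.standard_normal((k, m))
    B = rng.standard_normal((m, n))
    lhs = np.linalg.norm(A @ B, "fro")                       # ||AB||_F
    rhs = np.linalg.norm(A, 2) * np.linalg.norm(B, "fro")    # ||A|| * ||B||_F
    assert lhs <= rhs + 1e-12, (lhs, rhs)
```

Of course this only builds confidence; it does not replace the argument below.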
If you know a little spectral theory, you can square both sides and recognize that the problem is equivalent to proving
$\text{trace}\big(XY\big) \leq \lambda_1 \cdot \text{trace}\big(Y\big)$
Here $\lambda_1$ is the maximal eigenvalue of $X$, and $X, Y$ are Hermitian positive semidefinite. Since $X$ is Hermitian, it is unitarily diagonalizable as $X = Q\Lambda Q^*$, so (using the cyclic property of the trace)
$\text{trace}\Big(XY\Big) $
$=\text{trace}\Big(Q\Lambda Q^*Y\Big) $
$=\text{trace}\Big(\Lambda \big(Q^*YQ\big)\Big) $
$=\text{trace}\Big(\Lambda Z\Big) $
$=\sum_{k} \lambda_k \cdot z_{k,k}$
$\leq \sum_{k} \lambda_1 \cdot z_{k,k}$ (the diagonal entries $z_{k,k}$ of the PSD matrix $Z = Q^*YQ$ are nonnegative)
$= \lambda_1 \cdot \text{trace}\Big(Z\Big)$
$= \lambda_1 \cdot \text{trace}\Big(Q^* Y Q\Big)$
$= \lambda_1 \cdot \text{trace}\Big(Y\Big)$
Selecting $X:= A^*A$ and $Y:= BB^*$ completes the proof: indeed $\text{trace}(XY)=\text{trace}\big(A^*ABB^*\big)=\text{trace}\big((AB)^*(AB)\big)=\|AB\|_F^2$, while $\lambda_1(A^*A)=\|A\|^2$ and $\text{trace}(BB^*)=\|B\|_F^2$.
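The intermediate trace inequality can also be checked numerically on its own; a minimal sketch, generating random PSD matrices as Gram matrices $GG^T$:

```python
import numpy as np

# Check trace(XY) <= lambda_1(X) * trace(Y) for random PSD matrices X, Y.
rng = np.random.default_rng(1)
for _ in range(100):
    m = int(rng.integers(1, 8))
    Gx = rng.standard_normal((m, m))
    Gy = rng.standard_normal((m, m))
    X = Gx @ Gx.T                       # PSD by construction
    Y = Gy @ Gy.T                       # PSD by construction
    lam1 = np.linalg.eigvalsh(X)[-1]    # eigvalsh returns eigenvalues ascending
    assert np.trace(X @ Y) <= lam1 * np.trace(Y) + 1e-8
```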