The spectral norm of the matrix A is defined by
$$\|A\|_2=\bar{\sigma}(A)$$
where $\bar{\sigma}(A)$ is the maximum singular value of A.
Well, I can't understand why, given two matrices $A$ and $B$,
$$\bar{\sigma}(A\cdot B)\leq \bar{\sigma}(A)\cdot \bar{\sigma}(B).$$
I can't see which property is involved. It is not the triangle inequality, nor the homogeneity property. Thanks for your explanation.
https://en.wikipedia.org/wiki/Matrix_norm#Matrix_norms_induced_by_vector_norms
If $\|\cdot\|$ is a norm on $\mathbb C^n$, then it induces a matrix norm by the following: $$ \|A\|:= \sup\left\{ \frac{\|Ax\|}{\|x\|}\ :\ 0\ne x\in\mathbb C^n\right\}.$$ These induced norms satisfy $\|AB\|\le \|A\|\|B\|$ for all square matrices $A, B$.
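To spell out why submultiplicativity holds: the definition of the supremum gives $\|Ay\|\le\|A\|\,\|y\|$ for every vector $y$, so for any $x\ne 0$,
$$\|ABx\| \le \|A\|\,\|Bx\| \le \|A\|\,\|B\|\,\|x\|.$$
Dividing by $\|x\|$ and taking the supremum over all $x\ne 0$ yields $\|AB\|\le\|A\|\|B\|$.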
Now the $\|\cdot\|_2$ norm you defined in terms of the singular value decomposition happens to be the norm induced by the standard Euclidean norm on $\mathbb C^n$; see the linked Wikipedia page. In particular, the $\|\cdot\|_2$ norm satisfies the property you ask for.
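As a quick numerical illustration (a sketch using NumPy, not part of the argument above), one can check the inequality on random matrices by computing the largest singular value directly:

```python
# Sanity check: the largest singular value of A @ B is bounded by the
# product of the largest singular values of A and B.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

def spectral_norm(M):
    # Singular values are returned in descending order,
    # so the first one is the spectral norm of M.
    return np.linalg.svd(M, compute_uv=False)[0]

# Submultiplicativity (small tolerance for floating-point error):
assert spectral_norm(A @ B) <= spectral_norm(A) * spectral_norm(B) + 1e-12

# The same quantity via the induced-norm interface:
assert abs(spectral_norm(A) - np.linalg.norm(A, 2)) < 1e-12
```

Note that `np.linalg.norm(M, 2)` computes the induced 2-norm directly, confirming numerically that it coincides with the largest singular value.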