I am taking a course on Convex Optimization where I learned about the equivalence of norms, but there was no mention of why it is important or of scenarios where it comes in handy. The definition I am referring to is as follows.
Two norms $\|x\|_a$ and $\|x\|_b$ are said to be equivalent if there exist constants $\alpha, \beta > 0$ such that for all $x$,
$\alpha \|x\|_b \leq \|x\|_a \leq \beta \|x\|_b$.
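As a quick numerical sanity check (not part of the original question), here is a small Python sketch verifying the definition for the $1$-norm and the $\infty$-norm on $\mathbb R^{5}$, where the standard constants $\alpha = 1$ and $\beta = m$ work; the helper names `norm_1` and `norm_inf` are my own.

```python
import random

def norm_1(x):
    # 1-norm: sum of absolute values of the components
    return sum(abs(t) for t in x)

def norm_inf(x):
    # infinity-norm: largest absolute component
    return max(abs(t) for t in x)

m = 5
random.seed(0)
for _ in range(1000):
    x = [random.uniform(-10, 10) for _ in range(m)]
    # equivalence with alpha = 1 and beta = m:
    #   ||x||_inf <= ||x||_1 <= m * ||x||_inf
    assert norm_inf(x) <= norm_1(x) <= m * norm_inf(x)
print("alpha = 1, beta = m hold on all sampled vectors")
```

Of course, a finite sample is not a proof; both inequalities follow directly by comparing the largest $|x_i|$ with the sum of all $|x_i|$.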
On $\mathbb R^{m}$, all norms are equivalent in the sense that if $\| \cdot \|_{a}$ and $\| \cdot \|_{b}$ are norms, then there are constants $C_{1}$ and $C_{2}$ such that for every $x$ in $\mathbb R^{m}$,
$C_{1} \| x \|_{a} \leq \| x \|_{b} \leq C_{2} \| x \|_{a}$
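For a concrete pair of constants (a standard fact, not stated above): for the $1$- and $2$-norms on $\mathbb R^{m}$, the Cauchy–Schwarz inequality gives

$\| x \|_{2} \leq \| x \|_{1} \leq \sqrt{m}\, \| x \|_{2}$,

so with $\| \cdot \|_{a} = \| \cdot \|_{2}$ and $\| \cdot \|_{b} = \| \cdot \|_{1}$, the constants $C_{1} = 1$ and $C_{2} = \sqrt{m}$ work. This is exactly the pair of norms used in the argument below.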
An easy consequence of this is that if $x^{(n)}$ is a sequence of vectors such that
$\lim_{n \rightarrow \infty} \| x^{(n)} \|_{a}=0$
then, since $0 \leq \| x^{(n)} \|_{b} \leq C_{2} \| x^{(n)} \|_{a}$, the squeeze theorem gives
$\lim_{n \rightarrow \infty} \| x^{(n)} \|_{b}=0$.
This means that if you want to prove that $\| x^{(n)} \|_{a}$ converges to 0, you can pick any convenient norm and use it in the proof. For example, if you want to show that
$\lim_{n \rightarrow \infty} \| x^{(n)} \|_{2}=0$,
you can start by showing that for each component $i = 1, \dots, m$,
$\lim_{n \rightarrow \infty} | x^{(n)}_{i} | = 0$,
then, since the sum has only finitely many terms,
$\lim_{n \rightarrow \infty} \sum_{i=1}^{m} | x^{(n)}_{i} | = 0$
or
$\lim_{n \rightarrow \infty} \| x^{(n)} \|_{1}=0$
and from there, you can use the equivalence of norms to conclude that
$\lim_{n \rightarrow \infty} \| x^{(n)} \|_{2}=0$.
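The whole route (componentwise $\to$ $1$-norm $\to$ $2$-norm) can be illustrated numerically; this is a sketch with a concrete example sequence $x^{(n)} = (1/n,\, 2/n,\, 3/n)$ of my own choosing, not anything from the original argument.

```python
import math

def norm_1(x):
    # 1-norm: sum of absolute values
    return sum(abs(t) for t in x)

def norm_2(x):
    # 2-norm: Euclidean length
    return math.sqrt(sum(t * t for t in x))

def x_seq(n):
    # example sequence x^(n) = (1/n, 2/n, 3/n); each component tends to 0
    return [1 / n, 2 / n, 3 / n]

for n in (1, 10, 100, 1000):
    x = x_seq(n)
    # ||x||_2 <= ||x||_1, so driving the 1-norm to 0 drives the 2-norm to 0
    assert norm_2(x) <= norm_1(x)
    print(n, norm_1(x), norm_2(x))
```

Both printed norms shrink toward $0$ together, which is the content of the equivalence: controlling the convenient norm controls the one you actually care about.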
Because of the equivalence of norms in $\mathbb R^{m}$, we say that
$\lim_{n \rightarrow \infty} x^{(n)}=x^{*}$
if there is some norm in which
$\lim_{n \rightarrow \infty} \| x^{(n)}-x^{*} \|=0$.
It simply doesn't matter what norm is used.