What is the complexity (in flops, floating-point operations) of taking the $l_2$-norm of a vector $\mathbf{v}\in\mathbb{R}^n$ (or $\mathbf{v}\in\mathbb{C}^n$, if that makes a difference)?
We have the following definition of the $l_2$-norm of $\mathbf{v}$:
$$ \lVert\mathbf{v}\rVert_2= \sqrt{|v_1|^2+\dots+|v_n|^2} $$
where $|v_i|$ is the absolute value of $v_i$ (or complex modulus if $v_i$ complex).
I found that the square root is usually counted as one flop, which makes this operation cost $2n$ flops: $n$ multiplications (for the squares), $n-1$ additions, and one square root.
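To make the count concrete, here is a minimal sketch of the textbook formula (the function name `norm2_naive` is my own, not from any library); each loop iteration does one multiplication and one addition, plus a final square root:

```python
import math

def norm2_naive(v):
    """l2-norm straight from the definition: sqrt(|v_1|^2 + ... + |v_n|^2).

    Flop count: n multiplications, n - 1 additions, 1 square root,
    i.e. 2n flops if the square root is counted as one flop.
    abs() gives the complex modulus, so this also works for complex entries.
    """
    s = 0.0
    for x in v:
        s += abs(x) * abs(x)  # one multiplication + one addition per entry
    return math.sqrt(s)
```

For example, `norm2_naive([3.0, 4.0])` gives `5.0`.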
Is it correct to consider the square-root operation as a flop?
The rule of thumb is that a square root takes time equivalent to about 100 flops. But it is a constant cost, so it can be ignored for large $n$.
In practice, the two-norm is not computed this way. The reason is that squaring a very large (or very small) element can cause unnecessary overflow or underflow. What is done instead is that the entry of largest magnitude, say $v_{\max}$, is found first, and then each element is divided by it. Then $\lVert\mathbf{v}\rVert_2 = |v_{\max}| \, \lVert \tfrac{1}{v_{\max}} \mathbf{v} \rVert_2$. This increases the cost, but in the end it still requires only $O(n)$ time.
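A minimal sketch of this scaled computation (my own simplified two-pass version, not the actual BLAS `nrm2` algorithm, which does the scaling in a single pass): every scaled entry has magnitude at most 1, so squaring it cannot overflow.

```python
import math

def norm2_scaled(v):
    """Overflow/underflow-safe l2-norm via scaling by the largest magnitude.

    First pass finds v_max; second pass sums the squares of the scaled
    entries, each of which lies in [0, 1], so no intermediate overflow.
    """
    vmax = max(abs(x) for x in v)
    if vmax == 0.0:
        return 0.0  # the zero vector; avoid dividing by zero
    s = sum((abs(x) / vmax) ** 2 for x in v)  # each term <= 1
    return vmax * math.sqrt(s)
```

With entries like `1e200`, squaring directly would overflow to `inf` in double precision, while `norm2_scaled([1e200, 1e200])` correctly returns $\sqrt{2}\cdot 10^{200}$.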