Suppose a length is measured as $12.5$ meters, accurate to $0.1$ of a meter; this means the absolute error is at most $0.05$ m.
The relative error is $\frac{0.05}{12.5} = 0.004$. This means the measurement is accurate to $\frac{4}{1000}$, or $0.4\%$; equivalently, for each unit we measure we introduce an error of $0.004$.
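For concreteness, a quick numerical check of this arithmetic (a minimal Python sketch, nothing beyond the numbers above):

```python
# Measurement example: 12.5 m, accurate to 0.1 m.
length = 12.5          # measured length in meters
abs_err = 0.05         # half of the 0.1 m resolution
rel_err = abs_err / length
print(rel_err)         # 0.004, i.e. 0.4% per measured unit
```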
Now if we compare $10^{-50}$ with $10^{-6}$, the absolute error is $|10^{-50} - 10^{-6}| \approx 10^{-6}$.
The relative error is then $\frac{\text{absolute error}}{\text{measured value}} = \frac{10^{-6}}{10^{-6}} = 1$.
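The same computation carried out numerically (again just a Python sketch of the arithmetic above; dividing by the measured value, as in the formula):

```python
true_value = 1e-50
measured   = 1e-6
abs_err = abs(true_value - measured)   # ~1e-06: a tiny absolute difference
rel_err = abs_err / measured           # 1.0: a "100%" relative error
print(abs_err, rel_err)
```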
What is the meaning of $1$ here? How is it interpreted, and how, starting from such a small absolute difference, do we arrive at something that seems to indicate a $100\%$ error?
Briefly, when the absolute value of the relative error reaches or exceeds unity, you can no longer trust the sign. If $T$ is the target value and $A$ is the approximation, then the absolute error is $$E = T-A$$ and the relative error is $$R=\frac{E}{T}.$$ It follows that $$A = T - (T-A) = T - \frac{T-A}{T}T = T-RT= T(1-R).$$ Typically, we do not know the relative error, but we have an upper bound for its absolute value. Now if $$|R|<1,$$ then $A$ and $T$ have the same sign. If $|R|\geq1$, then it is entirely possible that $A$ and $T$ have different signs.
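Here is a small Python sketch of the identity $A = T(1-R)$; the function name and the sample values of $R$ are purely illustrative:

```python
# Using the convention above, R = (T - A) / T, so A = T * (1 - R).
def approx_from_relative_error(T, R):
    return T * (1.0 - R)

T = 2.0
# |R| < 1 forces A to have the same sign as T; |R| >= 1 gives no such guarantee.
for R in (0.5, 0.99, 1.0, 1.5):
    A = approx_from_relative_error(T, R)
    print(f"R = {R:4}  ->  A = {A:6}, same sign as T: {A * T > 0}")
```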
Computing the correct sign is critical in root-finding applications. If $f : \mathbb{R} \rightarrow\mathbb{R}$ is continuous and $f(x_1)$ and $f(x_2)$ have different signs, then $f$ has a zero in the interval between $x_1$ and $x_2$. If we cannot trust the sign of the computed value of $f(x_i)$, then we cannot make this determination with certainty.
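To make the reliance on the sign explicit, here is a minimal bisection sketch (an illustrative implementation, not taken from any particular library):

```python
def bisect(f, x1, x2, tol=1e-12):
    f1, f2 = f(x1), f(x2)
    assert f1 * f2 < 0, "need a sign change to guarantee a root in [x1, x2]"
    while x2 - x1 > tol:
        m = 0.5 * (x1 + x2)
        fm = f(m)
        # If rounding errors flip the sign of fm, we keep the wrong half of
        # the interval and the bracketing guarantee silently evaporates.
        if f1 * fm <= 0:
            x2, f2 = m, fm
        else:
            x1, f1 = m, fm
    return 0.5 * (x1 + x2)

print(bisect(lambda x: x**3 - 2.0, 1.0, 2.0))   # ~ 2**(1/3)
```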