Do we ever distinguish between $0^+$ and $0^-$?


I know the IEEE 754 floating-point standard does, with its signed zeros, for numerical reasons (see the sketch at the end of the post), but I’m asking mathematically. I ask because I’ve been thinking about how:

(VAGUE LANGUAGE BEGINS HERE) the sum of uncountably many zeros can apparently be non-zero. Take the length of an interval, or probabilities for continuous random variables: $P(\{\omega\}) = 0$ for every individual outcome $\omega$, and yet $P(\{\omega \mid 0 \le \omega \le 1\}) > 0$. Perhaps that’s because the $0$ in $P(\{\omega\}) = 0$ was always somehow $0^+$, something slightly more than $0$: it was $\frac{1}{\infty}$, a something divided by infinity, getting ever closer to $0$, but from the right-hand side. (VAGUE LANGUAGE ENDS HERE)
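
To pin the example down (a standard fact, stated here for the uniform probability measure $P$ on $[0,1]$): every singleton is null, the whole interval has probability one, and additivity is only guaranteed over countably many disjoint pieces,

$$P(\{\omega\}) = 0 \quad \text{for each } \omega \in [0,1], \qquad P([0,1]) = 1, \qquad P\Bigl(\bigcup_{n=1}^{\infty} A_n\Bigr) = \sum_{n=1}^{\infty} P(A_n) \quad \text{for disjoint } A_n.$$

Since $[0,1]$ is uncountable, it is not a countable union of singletons, so the axioms never actually assert that a sum of zeros equals $1$.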

I’m aware there’s no real paradox; I haven’t defined uncountable sums, so who’s to say anything about their behaviour? However, I do wonder if there is something to the idea of tracking whether we got to $0$ through a limit that approached from the left versus from the right (or in some other direction, or in multiple directions), and using that information to make predictions about whether the ‘uncountable sum’ (again, I know I haven’t defined it) will be positive or negative.
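
Coming back to the IEEE 754 remark at the top: floating point really does track exactly this kind of ‘from which side did we reach $0$’ information, in the sign bit of zero. Here is a minimal Python sketch (just an illustration of the standard’s behaviour, not part of the mathematical question):

```python
import math

pos, neg = 0.0, -0.0

# The two zeros compare equal under IEEE 754 arithmetic...
print(pos == neg)               # True

# ...but the sign bit survives and is observable.
print(neg)                      # -0.0
print(math.copysign(1.0, neg))  # -1.0

# A negative zero arises from computations that approach 0 from
# below, e.g. 1/x rounds to -0.0 as x -> -infinity:
print(1.0 / float("-inf"))      # -0.0

# Functions with a jump at 0 consult the sign of zero to decide
# which one-sided limit to return:
print(math.atan2(0.0, -0.0))    # 3.141592653589793  (pi)
print(math.atan2(0.0, 0.0))     # 0.0
```

In that sense $-0.0$ behaves like a $0^-$: it records the direction of approach, and $\operatorname{atan2}$ uses it to return the appropriate one-sided limit, mirroring $\lim_{x \to 0^-}$ versus $\lim_{x \to 0^+}$.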