So pretty much everyone can agree that the following is undefined:
$$\frac{n}{0},~~(n\neq 0).$$
The reasoning is that as the denominator approaches zero from the positive side the quotient tends to $+\infty$, while as it approaches zero from the negative side the quotient tends to $-\infty$.
I am wondering, however, if the absolute value of a number divided by zero is defined. Can anyone give me a good reason why the following would or wouldn't be true:
$$ \left|\frac{n}{0}\right| = \infty ,~~(n\neq 0).$$
There are three ways of interpreting the $\infty$ symbol.

First, you can extend the real numbers $\mathbb R$ to $\mathbb R \cup \{\infty\}$ and then extend the order relation $<$ and the operations $+$ and $\times$ to this new set. What you get is perhaps a slightly nicer way of expressing some limit ideas, but at the cost of destroying much of the algebraic structure of the real numbers: your new object $\mathbb R \cup \{\infty\}$ will not be a field. My personal opinion is that the algebraic cost of this approach far outweighs any possible benefit. A variation of this approach is to introduce two new symbols, $\infty$ and $-\infty$. The resulting object is even worse (if possible) algebraically.

The second way is to misappropriate the real projective line $$\mathbb P^1\mathbb R=\{(a:b)\mid a \text{ and } b \text{ are not both } 0\},$$ where $$(a:b)=\{(ta,tb)\mid t \ne 0\}.$$ Since the map $f:\mathbb R \to \mathbb P^1\mathbb R$ defined by $f(a)=(a:1)$ is injective, since $(1:0)$ is the only member of $\mathbb P^1\mathbb R$ not in the image of $f$, and since $f^{-1}(a:b)=a/b$ if $b \ne 0$, it is tempting to write $\infty$ as a symbol for $(1:0)$ and then to treat $0$ and $\infty$ as each other's reciprocals. My personal opinion is that these temptations should be firmly resisted: the resulting algebraic object is not a group under multiplication.

My own view is that, not only are the first and second approaches mathematically unsound unless handled with extreme care, they are pedagogically very dangerous. Hazy remnants of these ideas float around in the mathematical consciousness of beginning undergraduates, leading them to try to add $\infty$ to $-\infty$.

The third approach, and to my mind by far the best, is to treat the symbolism $\lim_{n \to 0^+}\frac{1}{n}=\infty$ simply as an abbreviation for "for every $M>0$ there exists $\epsilon>0$ such that $\frac{1}{n}>M$ whenever $0<n<\epsilon$."
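As an aside, both the algebraic breakage of the first approach and the $\epsilon$–$M$ reading of the third can be seen concretely: IEEE-754 floating point actually implements a two-infinity extension of the reals, and it is forced to return NaN for $\infty - \infty$. A small Python sketch (the helper `witness_eps` is my own name, not standard):

```python
import math

# First approach: IEEE-754 floats carry two infinities, and they exhibit
# exactly the algebraic breakage described above.
inf = float('inf')
print(inf + 1.0)              # inf  (infinity absorbs finite addition)
print(math.isnan(inf - inf))  # True (inf + (-inf) has no consistent value)

# Third approach: lim_{n -> 0+} 1/n = infinity abbreviates
# "for every M > 0 there is an eps > 0 with 1/n > M whenever 0 < n < eps".
def witness_eps(M):
    """For a given M > 0, return an eps > 0 such that 1/n > M for all 0 < n < eps."""
    return 1.0 / M  # eps = 1/M works, since 0 < n < 1/M implies 1/n > M

for M in (10.0, 1e6, 1e12):
    eps = witness_eps(M)
    n = eps / 2               # any sample n in (0, eps)
    assert 1.0 / n > M
print("epsilon-M check passed")
```

The NaN result is the machine-level echo of the pedagogical danger noted above: there is no value $\infty + (-\infty)$ could consistently take.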