A Quick Note
I know there is a slew of related questions on Stack Exchange, but none of them really seems to answer my question. This post is the closest to mine, but its answer gives only a high-level mathematical explanation, not an example I can teach my kids. Growing up, my school always taught that division by zero was undefined or not allowed, but never really explained why, or how, this was true.
The proposed duplicate has a very good answer that I understand; however, I'm not sure my kids would understand it. The accepted answer will need to be understandable by children under age 10 with a minimal working knowledge of multiplication and division.
Getting Started
The other day, I was working on a project at home in which my code divided a double-precision floating-point number by zero. In the computer world this isn't always undefined and can instead result in $\infty$. The reason for this is clearly specified in IEEE 754 and explained quite thoroughly in this Stack Overflow post:
Division by zero (an operation on finite operands gives an exact infinite result, e.g., $\frac{1}{0}$ or $\log{0}$) (returns ±$\infty$ by default).
Now, this got me thinking about basic arithmetic and how to justify each operation, and I ran into a mental inconsistency between multiplication and division.
Multiplication
As this is an important part of the thought process that led me down this mental rabbit hole, I am including the elementary explanation of multiplication.
- If I place $10$ marbles on my desk, $3$ times, I have placed $30$ marbles on my desk.
- This is expressed as $10 \cdot 3 = 30$ and is true.
- If I place $10$ marbles on my desk, $0$ times, I have placed $0$ marbles on my desk.
- This is expressed as $10 \cdot 0 = 0$ and is true.
These two scenarios are true no matter what numbers are used.
Division
This is where things took an unexpected turn in my mind.
Let's say that I am a wandering saint and I have 50 apples. I want to help the hungry people of the world so I give my apples away freely. Now, let's handle two similar scenarios.
- I come across $10$ people, and I want to give them all of my apples; I also want to ensure that each person receives the same number of apples. With $50$ apples to disperse across $10$ people, each person receives $5$ apples.
- This is expressed as $\frac{50}{10} = 5$ and is true.
However, let's say I have the same $50$ apples, and I come across a town where no one is hungry, and no one wants my apples. Well, I have $50$ apples, and I have $0$ people to give them to, so I still have $50$ apples. I didn't disperse my apples evenly across any number of people, so it's still the same bag of $50$ apples.
I believe this may be my mind's way of bending the facts, and that I've convinced myself I'm dividing $50$ zero times, when in fact I may have divided $50$ one time (by me). But it has me thinking: if I divide a pizza into zero equal slices, then I essentially didn't slice the pizza, and thus I still have an entire pizza.
My Question
How can it be proved thoroughly, not just with math, but with an example explanation (understandable by children) that division by zero is truly undefined?
That division by zero is undefined cannot be proven without math, because it is a mathematical statement. It's like asking, "How can you prove that pass interference is a foul without reference to sports?" If you have a definition of "division", then you can ask whether that definition can be applied to zero. For instance, if you define division such that $x \div y$ means "the number $z$ such that $y \cdot z = x$", there is no such number in the standard real number system when $y = 0$. If we require that $(x \div y) \cdot y = x$, then that fails when $y$ equals zero. In computer languages where `x/0` returns an object for which multiplication is defined, you do not have that `(x/0) * 0 == x`. So we can create a class of objects in which we call one of the objects "zero" and define a class method such that "division" by "zero" is defined, but that class will not behave exactly like the real numbers do.

Another definition of division is in terms of repeated subtraction. If you take $50$ apples and give one apple each to $10$ people, then keep doing that until you run out of apples, each person ends up with $5$ apples. You're repeatedly subtracting $10$ from $50$, and you can do that $5$ times. If you try to subtract $0$ from $50$ until you run out of apples, you'll be doing it an infinite number of times.