I've got a student in a senior-year maths class who is adamant that it is incorrect to depict it as a hole, even though this is the standard way of depicting it.
I couldn't find any problems with his reasoning, so I've asked him to write his reasoning up in a paper so that someone who knows more about this topic than I do can get to the bottom of it. Link below:
http://www.slideshare.net/MrIndererminate/indeterminate-is-not-synonymous-with-undefined
Note that just because all numbers "are equally valid" as the answer to $\frac00$ does not make any of them valid. In fact, an implication of his argument is that all answers are equally invalid.
Additionally, we define operations as functions (this will be important in a minute). Give your student an example from number theory:
Let the successor function be defined as follows:
$$s(n)=n+1$$
And so we define addition $n+k$ as $n$ plugged into the successor function $k$ times.
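To make this concrete, here is a quick sketch (my own, not from the student's paper) of addition built from the successor function exactly as described above:

```python
def s(n):
    """Successor function: s(n) = n + 1."""
    return n + 1

def add(n, k):
    """Define n + k as n plugged into the successor function k times."""
    result = n
    for _ in range(k):
        result = s(result)
    return result

print(add(3, 4))  # 7
```

Note that `add` always produces exactly one output for each pair of inputs, which is what lets us call addition a function in the first place.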
But one part of the definition of a function becomes very important here: a function must assign a single output to each input (and that output can be "undefined").
So we can define:
$$\frac xy=d(x,y)= \begin{cases} z\text{ s.t. } zy=x, & \text{if } y \neq 0\\ \text{undefined}, & \text{if } y = 0\\ \end{cases}$$
Without that second case, we can't define division as a function, and therefore can't define it as an operation at all. I think you can see the algebraic implications of that.
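Here is a minimal sketch of that piecewise definition, with `None` standing in for "undefined" (the name `d` mirrors the $d(x,y)$ above; this is my own illustration, not part of the original argument):

```python
from fractions import Fraction

def d(x, y):
    """d(x, y) = the unique z with z*y == x, or None ("undefined") if y == 0."""
    if y == 0:
        return None  # undefined: no unique z satisfies z*0 == x
    return Fraction(x, y)

print(d(6, 3))  # 2
print(d(0, 0))  # None
```

The key point is that `d` still returns exactly one value for every pair of inputs, even when that value is "undefined", so division remains a function.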
Another example is the square root function: $\sqrt{x}\geq0$ for all $x\geq0$. But doesn't it make sense that the square root should give us two answers, one positive and one negative? No, because we want operations to be functions. So we define the output to be the principal square root: the root that, plotted on the complex plane in polar coordinates, has the argument $\theta$ of smallest magnitude. For positive real numbers, this is always the positive root.
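The principal-root convention above can be seen directly in Python's `cmath` module, which picks the single principal value rather than returning two roots (a quick illustration of my own):

```python
import cmath

# cmath.sqrt returns the principal square root: the root whose
# argument theta has the smallest magnitude.
print(cmath.sqrt(9))   # (3+0j), not -3
print(cmath.sqrt(-4))  # 2j, the root with argument pi/2
```

Making the square root single-valued by convention is exactly the same move as making $\frac00$ undefined: both keep the operation a function.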
And that, in a nutshell, is why $\frac00$ is undefined. Tell your student that following this path of reasoning - i.e. holding to a point without understanding its implications - will make it impossible for him to get anywhere in the field of math. Abstract algebra, real analysis - they'd all fall apart under his definition. We wouldn't even have the rational numbers well defined. Mathematicians aren't idiots - we have our reasons.
In one sense, I'm being arrogant by calling myself a mathematician. But in another sense, anyone who pursues math voluntarily is a mathematician, and that questioning is exactly what makes math awesome.
But he does probably need to stop spouting off "revolutionary" theories about basic math facts.