In class, my teacher showed me this example:
8 / 10 in long division.
She said: "Take 8 and divide by 10; you can't do it, because 8 < 10 and 8 is not in 10's table. So add a decimal point, append a 0, and then divide. Every digit you get from now on goes after the decimal point."
So I'm wondering why should we add a decimal point?
Since $8<10$, we have $0<8/10<1$. So the quotient must be a non-integer, and writing it in decimal form requires a decimal point.
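Concretely, the teacher's procedure carries out exactly this: $10$ goes into $8$ zero times, so the integer part is $0$; appending the decimal point and a $0$ turns the remainder $8$ into $80$, which $10$ divides exactly $8$ times:
$$8 = 10\cdot 0 + 8, \qquad 80 = 10\cdot 8 + 0, \qquad \text{so } \frac{8}{10} = 0.8.$$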
edit: by the definition of division, $a/b=x$ means there exists some $x\in \mathbb R$ such that $bx=a$; in other words, "how many times does $b$ go into $a$?" If no integer $x$ satisfies this, the quotient is a non-integer even when $a>b$. Take $83/10$: there is no integer $x$ with $10x=83$. The largest integer with $10x\le 83$ is $x=8$ (giving $80$), and the next integer, $x=9$, already overshoots (giving $90$). Therefore $8<x<9$. But 8 and 9 are consecutive integers, so no integer lies strictly between them. Hence $x$ cannot be an integer.
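The schoolbook procedure can be sketched as a short program. This is a minimal sketch, not a standard library routine: the helper name `long_division` is made up for illustration, and it carries out exactly the teacher's rule, writing the decimal point once the integer part is exhausted and then "bringing down a zero" for each further digit.

```python
def long_division(a: int, b: int, max_digits: int = 10) -> str:
    """Divide a by b digit by digit, as in schoolbook long division."""
    q, r = divmod(a, b)          # integer part and the first remainder
    digits = str(q)
    if r == 0:
        return digits            # b goes into a exactly; no decimal needed
    digits += "."                # the decimal point the teacher adds
    while r != 0 and max_digits > 0:
        r *= 10                  # "bring down a zero"
        d, r = divmod(r, b)      # next decimal digit and new remainder
        digits += str(d)
        max_digits -= 1          # guard against repeating decimals
    return digits

print(long_division(8, 10))      # 8 < 10, so the point appears at once: 0.8
print(long_division(83, 10))     # integer part 8, then the digit 3: 8.3
```

Running it on the two examples above reproduces the hand computation: $8/10$ gives `0.8` and $83/10$ gives `8.3`, matching the argument that neither quotient is an integer.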