This has been bothering me a lot; here is my thinking:
$3\dfrac12 = \dfrac72 = \dfrac{35}{10}$
similarly $\dfrac45 = \dfrac{8}{10}$
So
$$\dfrac{\dfrac{35}{10}}{\dfrac{8}{10}}=\dfrac{35}{8}=4.375$$
but... $35 \bmod 8 = 3$, so shouldn't the fractional part be $3/10$? What's wrong with $4$ and $3/10$?
Do you ALWAYS just discard the denominator in division? Is that where I'm going wrong?
I don't know if I understand your question, but for any numbers $a$, $b$, and $c$, it is true that $$\frac{a/c}{b/c}=\frac{a}{b}$$ (as long as neither $b$ nor $c$ is zero, so that we're not dividing by zero anywhere).
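Applied to your own computation, the identity cancels the common denominator of $10$ immediately, and the remainder you found lives over $8$, not over $10$:
$$\frac{\dfrac{35}{10}}{\dfrac{8}{10}}=\frac{35}{8}=\frac{4\cdot 8+3}{8}=4+\frac{3}{8}=4.375$$
The $3$ from $35 \bmod 8$ is three *eighths*: once the tens cancel, the only denominator left is $8$. Indeed $4\frac{3}{8}=4.375$, whereas $4$ and $3/10$ would be $4.3$, which doesn't match your (correct) decimal answer.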