$3\frac12$ divided by $\frac45$; why do I get 4.3?


This has been bothering me a lot; here is my thinking:

$3\dfrac12 \implies \dfrac72 \implies \dfrac{35}{10}$

similarly $\dfrac45 \implies \dfrac{8}{10}$

So

$$\dfrac{\dfrac{35}{10}}{\dfrac{8}{10}}=\dfrac{35}{8}=4.375$$

but... $35 \bmod 8 = 3$, so shouldn't the answer be $4$ and $3/10$? What's wrong with $4$ and $3/10$?

Do you ALWAYS just discard the denominator in division, is that where I'm failing?


There are 3 best solutions below


I don't know if I understand your question, but for any numbers $a$, $b$, and $c$, it is true that $$\frac{a/c}{b/c}=\frac{a}{b}$$ (as long as neither $b$ nor $c$ is zero, so that we're not dividing by zero anywhere).
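To see this identity with the question's own numbers (these particular values of $a$, $b$, $c$ are just an illustration), a quick check with Python's standard `fractions` module:

```python
from fractions import Fraction

# (a/c) / (b/c) == a/b: the common denominator c cancels.
# Using the question's numbers: a = 35, b = 8, c = 10.
a, b, c = 35, 8, 10
lhs = Fraction(a, c) / Fraction(b, c)  # (35/10) / (8/10)
rhs = Fraction(a, b)                   # 35/8
print(lhs == rhs)  # True
```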


35 div 8 is 4 and 35 mod 8 is 3, so the answer is $4 + 3/8$ (not $4 + 3/10$): the remainder is a fraction of the *divisor* 8, not of 10.
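This quotient-and-remainder step can be checked directly with Python's built-in `divmod`:

```python
# divmod returns (quotient, remainder); the remainder is out of the
# divisor 8, so 35/8 = 4 + 3/8 = 4.375, not 4 + 3/10 = 4.3.
q, r = divmod(35, 8)
print(q, r)        # 4 3
print(q + r / 8)   # 4.375
```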


$$ 3.5=\frac{7}{2}\Longrightarrow \frac{7}{2}\div\frac{4}{5}=\frac{7}{2}\cdot\frac{5}{4}=\frac{35}{8}= 4\small{\frac{3}{8}}$$
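The invert-and-multiply step above can be sketched with exact rational arithmetic (again using the standard `fractions` module):

```python
from fractions import Fraction

# Dividing by 4/5 is the same as multiplying by its reciprocal 5/4:
# (7/2) * (5/4) = 35/8.
result = Fraction(7, 2) * Fraction(5, 4)
print(result, float(result))  # 35/8 4.375
```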