Ok so I'm reading a book right now that tackles pre-algebra and I stumbled on this example problem that goes like this.
Jerry bought a pie and ate 1⁄5 of it. Then his wife Doreen ate 1⁄6 of what was left. How much of the total pie was left?
The author's solution is to solve it like this.
First, subtract 1/5 from 1/1 (the whole pie), which equals 4/5, and then multiply 4/5 by 1/6 to get how much of the whole pie Doreen ate: 4/30 (which simplifies to 2/15).
My way, obviously, is to solve it like this.
First subtract 1/5 from 1/1 (the same as the author), but then I subtract 1/6 from 4/5, which gives me a totally different answer: 19/30.
So my questions are these:
1.) Why does subtracting one fraction from another not work in this scenario? (Because I get a different result.)
2.) Obviously (to me) the question calls for subtraction, so why does the author use multiplication instead?
3.) And why does multiplying instead of subtracting work?
Here, I recommend we use variables.
Let $p$ represent the pie that is being eaten (somewhat timidly).
Jerry eats $1/5$ of it, so $1-1/5 = 4/5$ of the original pie is left: $4p/5$.
Then his wife comes and eats $1/6$ of what is left of the pie.
Take the remaining pie ($4p/5$), and remove $1/6$ of this remaining amount, not $1/6$ of the original pie.
$(4p/5)(1-1/6) = 4p/5 \cdot 5/6 = 2p/3$. So two-thirds of the pie remains.
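If you want to check the arithmetic above exactly, here is a quick sketch using Python's `fractions` module (the variable names are just for illustration):

```python
from fractions import Fraction

pie = Fraction(1)                                  # the whole pie
after_jerry = pie * (1 - Fraction(1, 5))           # 4/5 of the pie is left
after_doreen = after_jerry * (1 - Fraction(1, 6))  # Doreen eats 1/6 of the *remainder*
print(after_doreen)                                # 2/3
```

Using `Fraction` avoids floating-point rounding, so the result comes out as an exact $2/3$.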
You were subtracting two fractions of the entire pie. But the question asks first about the entire pie, then about the remainder.
'Of' implies multiplication: $10\%\ \text{of}\ 500 = .1 \cdot 500 = 50$.
Multiplication is the only way it will work, because subtraction doesn't match the wording of the problem: "$1/6$ of what was left" is a fraction of the remainder, not of the whole pie.
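To see the two interpretations side by side, here is a small comparison (again a sketch with `fractions.Fraction`; the names `wrong` and `right` are mine):

```python
from fractions import Fraction

# Subtracting 1/6 directly treats it as 1/6 of the WHOLE pie:
wrong = 1 - Fraction(1, 5) - Fraction(1, 6)           # 19/30
# Multiplying takes 1/6 of what is actually LEFT:
right = (1 - Fraction(1, 5)) * (1 - Fraction(1, 6))   # 2/3

print(wrong, right)  # 19/30 2/3
```

The gap between the two answers, $2/3 - 19/30 = 1/30$, is exactly the difference between $1/6$ of the whole pie ($5/30$) and $1/6$ of the remaining $4/5$ ($4/30$).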
Cheers!