I came across the following question.
A man travels a distance of $20$ miles at $60$ miles/hr and then returns over the same route at $40$ miles/hr. What is the average rate for the round trip in miles per hour?
The way I see it is $(60+40)/2 = 50$ miles/hr. However, the book says the answer is $48$. Am I missing something here?
A clear explanation of how the book came up with $48$ instead of $50$ would be appreciated.
To see that your reasoning can’t be right, consider the extreme case in which he makes the outbound trip at $20$ mph and the return trip at $0$ mph. By your reasoning his average speed for the round trip would be $10$ mph, even though he never actually completes it! In fact you can see that in that case he takes an hour for the first leg and never completes the second at all. Had he actually averaged $10$ mph, he’d have covered the $20+20=40$ miles in $4$ hours, so the return trip would have taken $3$ hours, and his average speed on that leg would have been $20/3 \approx 6.67$ mph.
The reason you can’t simply average the speeds is that he covers the same distance in each direction, so he spends more time travelling at the slower speed. The slower speed therefore weighs more heavily in determining his overall average speed.
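To see where the book’s $48$ comes from, compute the average speed the correct way: total distance divided by total time.

$$
\text{average speed} = \frac{20 + 20}{\frac{20}{60} + \frac{20}{40}} = \frac{40}{\frac{1}{3} + \frac{1}{2}} = \frac{40}{\frac{5}{6}} = 48 \text{ mph}.
$$

In general, when equal distances are covered at speeds $a$ and $b$, the average speed is the harmonic mean $\dfrac{2ab}{a+b}$; here that gives $\dfrac{2 \cdot 60 \cdot 40}{60 + 40} = 48$ mph.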