I recently got this peculiar interview question, and I wanted some help figuring out how to reach an appropriate solution. Imagine a race car driving on a $50$-mile-long race track, with five minutes to complete the drive. Suppose that I went $20$ miles per hour on the first half of the track. How fast do I need to go on the second half so that I average $40$ miles per hour over the whole drive?
I immediately went for the idea that the answer was $60$ miles per hour, but supposedly that was wrong. I think I needed to better consider the fact that miles per hour is a measure of distance over time. So $$40\ \text{mph} = \frac{40 \text{ miles}}{60 \text{ minutes}},$$ but I am now stuck on how to use this information to deduce how many minutes I can take on the second half to average this speed. Any suggestions?
We have $v_1 = 20$ mph, $d_1 = 25$ miles, and $d_2 = 25$ miles.
We want $40 = \dfrac{d_1+d_2}{d_1/v_1+d_2/v_2} = \dfrac{50}{25/20+25/v_2}$ (that is: total distance divided by total time).
Solving for $v_2$ gives $$v_2=\dfrac{25}{50/40-25/20} = \dfrac{25}{0},$$ which is undefined: driving the first half at $20$ mph already takes $25/20 = 1.25$ hours, which is exactly the total time budget of $50/40 = 1.25$ hours for a $40$ mph average, so no time is left for the second half. Hence we would need infinite speed.
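If it helps, here is a minimal sketch in Python that makes the time budget explicit (the variable names are mine, not from the question):

```python
d1, d2 = 25.0, 25.0   # miles in each half of the track
v1 = 20.0             # speed on the first half, mph
v_avg = 40.0          # desired average speed, mph

t_total = (d1 + d2) / v_avg   # total time allowed for a 40 mph average, hours
t1 = d1 / v1                  # time already spent on the first half, hours
t2 = t_total - t1             # time left for the second half, hours

print(t_total * 60, t1 * 60, t2 * 60)  # 75.0 75.0 0.0 (minutes)
# v2 = d2 / t2 would divide by zero: no finite speed achieves the 40 mph average.
```

The entire $75$-minute budget is consumed by the first half, which is exactly where the $60$ mph intuition breaks down: average speed is total distance over total time, not the arithmetic mean of the two speeds.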