I'm stumped on this question:
Two planes flying at the same altitude are heading toward an airport. The paths of these planes form a right triangle. One is flying due east toward the airport at 450 mph. The other is approaching the airport from the south at 275 mph. When each is 100 miles from the airport how fast is the distance between the planes changing?
I differentiated the Pythagorean relation $x^2 + y^2 = d^2$ to get $2xx' + 2yy' = 2dd'$, which I can solve for the rate at which the distance is changing. Plugging in, I got:
$2(100)(450) + 2(100)(275) = 2\sqrt{100^2 + 100^2}\,d'$
Solving this on a calculator, I got $d' \approx 512.652$ miles/hr. But this is wrong, and I don't know why. Can someone please help? Thanks!
There's a sign issue: the distance is *decreasing*. Both planes are moving toward the airport, so $x$ and $y$ are shrinking, which means $x' = -450$ and $y' = -275$. With those signs,
$$d' = \frac{xx' + yy'}{d} = \frac{100(-450) + 100(-275)}{100\sqrt{2}} \approx -512.652 \text{ mph}.$$
Your magnitude was right; only the sign was off.
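As a quick sanity check, here is a small numerical sketch of the computation above (variable names `x`, `y`, `dx`, `dy` are just illustrative labels for the positions and rates):

```python
import math

# Distances (miles) from the airport and speeds (mph).
# Both planes are approaching, so the coordinates shrink
# and the rates of change are negative.
x, y = 100.0, 100.0
dx, dy = -450.0, -275.0

d = math.hypot(x, y)          # distance between the planes
dd = (x * dx + y * dy) / d    # solve x*x' + y*y' = d*d' for d'

print(round(dd, 3))           # prints -512.652 (mph, decreasing)
```

The negative value confirms the planes are closing in on each other at about 512.652 mph.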