A jet flies to the west for a distance of approximately 830 miles, starting at Point A and ending at Point B. The jet is moving at 400 mph, on average. A strong wind comes from the north at 40 mph. In minutes, how long would it take for the jet to reach its final destination at Point B?
I'm not quite sure why I'm having trouble with this question. I think it might have to do with the wind part, unless it's as deceptively simple as just dividing 830 by 400?
The plane has to fly in a direction slightly north of west so that its velocity plus the wind adds up to due-west motion. If you draw a picture, you'll get a right triangle formed by the wind vector, the plane's airspeed vector, and the plane's ground-speed vector. A little trig will see you home.
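To make that concrete, here is a minimal sketch of the computation (variable names are my own): the westward ground speed is the leg of the right triangle left over after the 40 mph crosswind component is cancelled, i.e. sqrt(400² − 40²), and the time is distance over that speed, converted to minutes.

```python
import math

airspeed = 400.0   # mph, plane's speed through the air
wind = 40.0        # mph, crosswind from the north
distance = 830.0   # miles, due west from A to B

# The plane points slightly north of west so the wind's southward push
# is cancelled; the westward ground speed is the remaining leg of the
# right triangle: sqrt(airspeed^2 - wind^2).
ground_speed = math.sqrt(airspeed**2 - wind**2)   # ≈ 398 mph

time_minutes = distance / ground_speed * 60       # ≈ 125.1 minutes
print(round(ground_speed, 1), round(time_minutes, 1))
```

So the wind barely matters here: the ground speed drops only from 400 to about 398 mph, and the trip takes roughly 125 minutes rather than the 124.5 you'd get ignoring the wind.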