Rate vs velocity?


A bomber releases a bomb while flying vertically upwards at a velocity of 1500 ft/sec at a height of 10,000 feet. a) How long after it is released will it take the bomb to reach the ground? b) Immediately after releasing the bomb, the bomber flies away horizontally at a rate of 1200 ft/sec. How far from the point at which the bomb strikes the ground is the plane when the bomb strikes?

For part B they multiply $1200$ by $100$ seconds to find how far the plane has gone. My question is: if I see $1200$ ft/sec as a velocity, why can't I integrate it to get $s(t)$ and just plug $t = 100$ seconds into the equation for $s(t)$?
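Concretely, the integration I have in mind is

$$s(t) = \int_0^t 1200 \, d\tau = 1200t, \qquad s(100) = 120{,}000 \text{ feet},$$

which gives the same number as the multiplication.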

For part A, the height of the bomb above the ground is $h(t) = -16t^2 + 1500t + 10000$. Setting this to zero and solving gives $t = 100$ seconds. For part B, the plane travels horizontally at constant speed, so its horizontal distance from the release point after $t$ seconds is $1200t$ feet; at $t = 100$ that is $120{,}000$ feet. Since the bomb has no horizontal velocity at release, it lands directly below the release point, so for the final answer you compute the hypotenuse of the right triangle whose legs are the $120{,}000$-foot horizontal distance and the plane's $10{,}000$-foot altitude.
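Spelled out, the quadratic from part A solves as

$$-16t^2 + 1500t + 10000 = 0 \;\Longrightarrow\; t = \frac{1500 + \sqrt{1500^2 + 4 \cdot 16 \cdot 10000}}{2 \cdot 16} = \frac{1500 + 1700}{32} = 100 \text{ seconds},$$

and the distance from the impact point to the plane is

$$d = \sqrt{120000^2 + 10000^2} = 10000\sqrt{145} \approx 120{,}416 \text{ feet}.$$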