I've tried to solve this problem myself, but I'm not sure I'm right. Here it goes:
In baseball, the pitcher's mound is 60.5 feet from home plate. The strike zone, or distance across the plate, is 17 inches. The time it takes for a baseball to reach home plate can be determined by dividing the distance the ball travels by the speed at which the pitcher throws the baseball.
If a pitcher throws a baseball at 90 miles per hour, how many seconds does it take for the baseball to reach home plate?
This is how I've tried to solve it: Time = Distance / Speed, so T = 60.5 ft / (90 miles/hour).

Since my answer is supposed to be in SECONDS and the distance is in FEET, I converted the speed: 90 miles = 475,200 ft and 1 hour = 3600 s, which gives T = 60.5 ft / (475200 ft / 3600 s).

Here is where I get confused: should I divide 60.5 ft by 475,200 ft and THEN divide that result by 3600 s, OR should I first divide 475,200 ft by 3600 s and then divide 60.5 ft by that result?
Also, I don't even know if this is the way I should be solving this problem at all. :/
$$ T= {60.5\,\text{ft} \over \left({475200\,\text{ft}\over3600\,\text{s}}\right)} = 60.5\,\text{ft} \div \left({475200\,\text{ft}\over3600\,\text{s}}\right)$$

So do the ft/s division first: 475200 ft ÷ 3600 s = 132 ft/s. Then divide the distance by that result: 60.5 ft ÷ 132 ft/s ≈ 0.458 s. (Dividing 60.5 by 475,200 and then by 3600 would be wrong, because dividing by a fraction is the same as multiplying by its reciprocal, not dividing by both parts in turn.)
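If it helps to check the arithmetic, here is a small sketch (hypothetical Python, all variable names made up for illustration) that computes the time both ways the question describes and shows which order is correct:

```python
# Pitch-time problem: 60.5 ft at 90 mph, answer wanted in seconds.
distance_ft = 60.5
speed_mph = 90

feet_per_mile = 5280
seconds_per_hour = 3600

# Convert the speed to ft/s first: 90 * 5280 / 3600 = 132 ft/s.
speed_fps = speed_mph * feet_per_mile / seconds_per_hour

# Correct order: divide distance by the speed in ft/s.
time_s = distance_ft / speed_fps

# The order the question worried about:
# dividing 60.5 by 475200 and THEN by 3600 gives a tiny, wrong number,
wrong_order = (distance_ft / 475200) / 3600
# while dividing 60.5 by (475200 / 3600) matches the correct answer.
right_order = distance_ft / (475200 / 3600)

print(speed_fps)            # 132.0
print(round(time_s, 3))     # 0.458
print(right_order == time_s, wrong_order == time_s)
```

So the ball takes roughly half a second to reach the plate, which is why the speed has to be reduced to ft/s before the final division.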