Expression with Different Units


So I've tried to solve this problem by myself but I'm not sure I'm right. So here it goes:

In baseball, the pitcher's mound is 60.5 feet from home plate. The strike zone, or distance across the plate, is 17 inches. The time it takes for a baseball to reach home plate can be determined by dividing the distance the ball travels by the speed at which the pitcher throws the baseball.

If a pitcher throws a baseball at 90 miles per hour, how many seconds does it take for the baseball to reach home plate?

This is how I've tried to solve it: Time = Distance/Speed, so $T = \frac{60.5\ \text{ft}}{90\ \text{miles/hour}}$.

Since my answer is supposed to be in SECONDS and the distance is in FEET, I converted the speed: $T = \frac{60.5\ \text{ft}}{475200\ \text{ft}/3600\ \text{s}}$.

Here is where I get confused: should I divide 60.5 ft by 475200 ft and THEN divide that by 3600 s, OR first divide 475200 ft by 3600 s and then divide 60.5 ft by that result?

Also, I don't even know if this is the way I should be solving this problem at all. :/


2 Answers

BEST ANSWER

$$ T= {60.5ft \over {475200ft\over3600s}} = {60.5ft \div \left({475200ft\over3600s}\right)}$$

So do the ft/s division first ($475200/3600 = 132$ ft/s), then divide the feet by the result: $60.5/132 \approx 0.458$ s. Dividing by a fraction is the same as multiplying by its reciprocal, which is why the inner division happens first.
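A quick numeric check of that order of operations (a sketch, not part of the original answer; the variable names are my own):

```python
# Convert 90 mph to feet per second, then divide the distance by it.
MILE_FT = 5280   # feet per mile
HOUR_S = 3600    # seconds per hour

speed_fps = 90 * MILE_FT / HOUR_S   # 475200 ft per 3600 s = 132 ft/s
time_s = 60.5 / speed_fps           # distance / speed

print(speed_fps)          # 132.0
print(round(time_s, 3))   # 0.458
```

Doing the divisions in the other order (60.5/475200, then /3600) would give a wildly wrong answer, because it treats 3600 s as another divisor instead of part of the speed's denominator.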


Ok. So $1$ mile $= 5280$ ft. Thus, in miles, the distance to the plate is $\frac{60.5}{5280} = 0.011458\dot{3} \approx 0.0115$ miles.

$90$ miles per hour is equivalent to $90/60=1.5$ miles per minute or $1.5/60=0.025$ miles per second.

Thus, the time it takes the ball to reach the plate is $0.0115/0.025 \approx 0.46$ s. (Using the unrounded distance gives about $0.458$ s, matching the other answer.)
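The same miles-based route can be checked numerically (a sketch with hypothetical variable names):

```python
# Convert the distance to miles and the speed to miles per second,
# then divide; this mirrors the steps in the answer above.
dist_mi = 60.5 / 5280     # distance in miles, ~0.011458
speed_mips = 90 / 3600    # 90 mph = 0.025 miles per second
time_s = dist_mi / speed_mips

print(round(time_s, 3))   # 0.458
```

Note that the answer's $0.46$ s comes from rounding the distance to $0.0115$ miles before dividing; keeping full precision gives $\approx 0.458$ s either way, so both methods agree.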