This is a homework question:
The SkyDome in Toronto has a center-field fence that is 10 feet high and 400 feet from home plate. A ball is hit 3 feet above the ground with an initial velocity of 100 miles per hour.
a) The ball leaves the bat at an angle of $\theta = \theta_0$ with the horizontal. Write a vector valued function for the path of the ball.
My answer, after converting 100 miles per hour to $\approx 146.7$ feet per second: $$r(t) = \langle (146.7\cos\theta_0)\,t,\ -16t^2 + (146.7\sin\theta_0)\,t + 3\rangle$$
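A quick numeric sanity check of this position function (a Python sketch; the names `V0` and `r` are mine, not from the problem):

```python
import math

V0 = 100 * 5280 / 3600  # 100 mph in ft/s, approximately 146.67

def r(t, theta):
    """Position (x, y) in feet at time t (seconds) for launch angle theta (radians)."""
    x = V0 * math.cos(theta) * t
    y = -16 * t**2 + V0 * math.sin(theta) * t + 3
    return x, y

# At t = 0 the ball should be at home plate, 3 ft above the ground:
print(r(0.0, math.radians(45)))  # (0.0, 3.0)
```

The conversion factor checks out: $100 \cdot 5280/3600 \approx 146.7$, matching the coefficient above.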
b) ... Draw some graphs and determine the minimum angle required for a home run.
c) Determine analytically the minimum angle required for the hit to be a home run.
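For part (b), the graphing step can be mimicked numerically: a sweep over launch angles (a Python sketch assuming the position function from part (a); the 0.01° step size is an arbitrary choice) finds the first angle whose trajectory clears the fence:

```python
import math

V0 = 100 * 5280 / 3600  # 100 mph in ft/s

def height_at_fence(theta):
    """Height of the ball (ft) when x = 400, for launch angle theta (radians)."""
    t = 400 / (V0 * math.cos(theta))  # time to reach the fence, from the x-component
    return -16 * t**2 + V0 * math.sin(theta) * t + 3

# Sweep angles in 0.01-degree steps; the first one clearing 10 ft is (roughly) the minimum.
theta_min = next(d / 100 for d in range(1, 9000)
                 if height_at_fence(math.radians(d / 100)) >= 10)
print(round(theta_min, 2))  # roughly 19.4 degrees
```

This is only a brute-force check, but it gives a target the analytic answer in part (c) should match.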
My answer: I set $t = x/(146.7\cos\theta)$ using the x-component, substituted it into the y-component (making the angle $\theta$ a function of $x$ and $y$), and solved for $\theta$. But when I plug in $x = 400$ and $y = 10$, I get an unrealistic answer...
EDIT: Here is how I solved for $\theta$:
Solve for $t$ using the x-component of the vector and plug the result into the y component:
$$y = -16(x/146.7\cos\theta)^2 + 146.7\sin\theta(x/146.7\cos\theta) + 3$$
I actually solved this in general for $x$ and $y$, but it's easier to plug in their values now:
$10 = -119\sec^2\theta + 400\tan\theta + 3$
$7 = -119\tan^2\theta - 119 + 400\tan\theta$
$-0.06 = \tan^2\theta - 3.36\tan\theta$
$2.76 = (\tan\theta - 1.68)^2$
$3.34 = \tan\theta$
$\tan^{-1}(3.34) \approx 73°$
$73°$ is obviously too high.
Do I have the correct position function, and how can I determine analytically the minimum angle required for a home run?
Your position function is correct; the problem is in the algebra. $\sec^2\theta = \tan^2\theta + 1$, not $\tan^2\theta - 1$, so the constant $-119$ must not disappear. Dividing $7 = -119\tan^2\theta - 119 + 400\tan\theta$ through by $-119$ gives $$-1.06 = \tan^2\theta - 3.36\tan\theta,$$ not $-0.06$. Completing the square then gives $(\tan\theta - 1.68)^2 \approx 1.76$, so $\tan\theta \approx 0.35$ or $\tan\theta \approx 3.01$, i.e. $\theta \approx 19.4°$ or $\theta \approx 71.6°$. Both roots clear the fence; the minimum angle is the smaller one, $\theta \approx 19.4°$.
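A quick check of the roots of the corrected quadratic $\tan^2\theta - 3.36\tan\theta + 1.06 = 0$ (a Python sketch; the coefficients come from the rounded equation, so the angles are approximate):

```python
import math

# Corrected quadratic in u = tan(theta): u^2 - 3.36u + 1.06 = 0
a, b, c = 1.0, -3.36, 1.06
disc = math.sqrt(b * b - 4 * a * c)
roots = [(-b - disc) / (2 * a), (-b + disc) / (2 * a)]  # smaller root first
angles = [math.degrees(math.atan(u)) for u in roots]
print([round(ang, 1) for ang in angles])  # smaller angle is the minimum for a home run
```

The smaller root gives the flat trajectory that just clears the fence; the larger root is the steep lob that clears it on the way down.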