A ball is shot at a velocity of $10.0\ m/s$ at $40.0^\circ$ above the horizontal. How far away does it land?
I know that the horizontal displacement equals the horizontal velocity times the time of flight; here the horizontal velocity is $10.0\cos(40.0^\circ) \approx 7.66\ \mathrm{m/s}$. I don't know how to find the time.
Use Components!
$v_y = 10.0\sin(40.0^\circ) = 6.43\ \mathrm{m/s}$
That is the vertical component of the velocity.
$d = v_y t + \frac{1}{2}a t^2$
When the ball lands it is back at launch height, so $d = 0$.
$0 = 6.43\,t - \tfrac{1}{2}(9.81)\,t^2$ (here $a = -9.81\ \mathrm{m/s^2}$ is the acceleration due to gravity)
Factoring out $t$ gives $t = 0$ or $t = \dfrac{2(6.43)}{9.81} \approx 1.31\ \mathrm{s}$.
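As a sanity check (this formula isn't in the original answer, but follows from the same kinematics for launch and landing at equal heights), the standard time-of-flight expression gives the same value:
$$t_{\text{flight}} = \frac{2 v_i \sin\theta}{g} = \frac{2(10.0)\sin(40.0^\circ)}{9.81} \approx 1.31\ \mathrm{s}.$$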
Ignore $t = 0$, since that is the moment we launch our projectile, and take the other root. The range is then the horizontal velocity times this time: $x = (10.0\cos 40.0^\circ)(1.31) \approx 10.0\ \mathrm{m}$.
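If you want to verify the numbers, here is a minimal Python sketch (the function name and structure are my own, not from the thread) that reproduces each step, assuming launch and landing at the same height with no air resistance:

```python
import math

def projectile_range(speed, angle_deg, g=9.81):
    """Return (time_of_flight, horizontal_range) for a projectile
    launched and landing at the same height, ignoring air resistance."""
    angle = math.radians(angle_deg)
    vx = speed * math.cos(angle)   # horizontal component of velocity
    vy = speed * math.sin(angle)   # vertical component of velocity
    t_flight = 2 * vy / g          # nonzero root of 0 = vy*t - 0.5*g*t^2
    return t_flight, vx * t_flight

t, x = projectile_range(10.0, 40.0)
print(f"vertical component: {10.0 * math.sin(math.radians(40.0)):.2f} m/s")  # ~6.43 m/s
print(f"time of flight:     {t:.2f} s")                                      # ~1.31 s
print(f"horizontal range:   {x:.2f} m")                                      # ~10.0 m
```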