A stone is thrown vertically downward with a velocity of $15$ meters per second, from a point $40$ meters above the ground (excuse my poor English).
a. How long will it take the stone to hit the ground?
I answered it like this:
$v_0=15$, $a(t)=a_0=g$, $v(t)=v_0+gt$ $\Rightarrow x(t)=x_0+v_0t+{1\over 2}gt^2\Rightarrow x(t)=x_0+15t+4.9t^2$.
I set $x_0=0$ (taking downward as positive), so $x(t_1)=40$, where $t_1$ is the time it takes the stone to reach the ground.
Then I solved $x(t_1)=40=15t_1+4.9t_1^2$ and got $t_1\approx 1.7$ s; the other root is negative, so I discarded it.
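As a quick sanity check on that root, the quadratic $4.9t^2+15t-40=0$ can be solved numerically (a short sketch in Python; the variable names are mine):

```python
import math

# Coefficients of 4.9*t^2 + 15*t - 40 = 0 (downward positive, x0 = 0)
a, b, c = 4.9, 15.0, -40.0

disc = b**2 - 4*a*c                   # discriminant: 225 + 784 = 1009
t1 = (-b + math.sqrt(disc)) / (2*a)   # physical root
t2 = (-b - math.sqrt(disc)) / (2*a)   # other root, negative, discarded

print(round(t1, 2))  # ≈ 1.71 s, matching the hand calculation
print(t2 < 0)        # True
```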
The problem is that I was then asked: where will the stone be at $2$ s? At $3$ s? That makes me feel like I got this all wrong. I am new to this and I am not sure I know what I am doing. I would appreciate your help.
The solution is not any more complicated. Measuring height upward from the ground, use the equation $y(t)=h_0+v_0t-\frac{1}{2}gt^2$, where $h_0$ is the initial height ($40$ m) and $v_0$ the initial velocity ($-15$ m/s here, since the stone is thrown downward). Now solve
$40-15t-4.9t^2=0,$
which gives the same $t_1\approx 1.7$ s you found, and use the same formula for the other questions too.
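To see why the follow-up times are a bit of a trick question, you can evaluate the height formula at $t=2$ s and $t=3$ s (a quick sketch; I'm taking $v_0 = 15$ m/s downward, as in the problem statement):

```python
def y(t, h0=40.0, v0=-15.0, g=9.8):
    """Height above the ground: y(t) = h0 + v0*t - 0.5*g*t^2."""
    return h0 + v0 * t - 0.5 * g * t**2

for t in (2.0, 3.0):
    print(t, round(y(t), 1))  # both values come out negative
```

Both results are negative, i.e. the formula puts the stone below ground level. That is because the free-fall model only applies until impact at $t_1\approx 1.7$ s; after that the stone is simply sitting on the ground at $y=0$.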