Is there a way to prove that the distance traveled to reach a certain velocity increases monotonically as the applied force decreases?


Suppose a time-varying force $F_1(t)$ is applied to an object. Let $L_1$ denote the distance the object has traveled when it reaches a speed $V_T$. It is easy to establish the relationship between $F_1(t)$ and $L_1$.

Now suppose a smaller force is applied to another object of the same mass, i.e. $F_2(t) < F_1(t)$ for all $t > 0$. Define $L_2$ as the distance that object has traveled when it reaches speed $V_T$.

My question is: is there a way to prove that $L_1 < L_2$? Intuitively it seems correct, and it is easy to prove when $F_1(t)$ and $F_2(t)$ are constant. But I am having trouble proving the general case, where there is no extra information about $F_1(t)$ and $F_2(t)$.


There is 1 best solution below.

On BEST ANSWER

No, this is not necessarily true when $F_1$ and $F_2$ can vary in time. I will neglect units in my examples and take mass $= 1$.

For instance, define $F_1(t) = \begin{cases} 1 & t < 1 \\ 5 & t \geq 1 \end{cases}$, and define $F_2(t) = \begin{cases} 0 & t < 1 \\ 4 & t \geq 1 \end{cases}$. Let $V_T = 1$. Then under $F_1$, the particle accelerates to $V_T$ during the time interval $[0, 1]$, with all displacement occurring there under a force of $1$: we get $v(t) = t$, so $v = V_T$ at $t = 1$, and $L_1 = \frac{1}{2}$. However, under $F_2$, the particle sits still on $[0, 1)$ and all displacement occurs on $[1, \infty)$ under a force of $4$: $v(t) = 4(t-1)$ reaches $V_T$ at $t = \frac{5}{4}$, giving $L_2 = \frac{1}{2} \cdot 4 \cdot \left(\frac{1}{4}\right)^2 = \frac{1}{8} < L_1$, even though $F_2(t) < F_1(t)$ for all $t$.
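If it helps to sanity-check the counterexample numerically, here is a small sketch (the function name `distance_to_speed` and the integration scheme are my own choices, not from the answer) that integrates $\ddot{x} = F(t)/m$ with a simple semi-implicit Euler step until the target speed is reached:

```python
def distance_to_speed(force, v_target, dt=1e-5, mass=1.0):
    """Integrate x'' = force(t)/mass from rest until speed >= v_target.

    Returns the distance traveled up to that moment (semi-implicit Euler).
    """
    t = v = x = 0.0
    while v < v_target:
        a = force(t) / mass
        v += a * dt   # update velocity first ...
        x += v * dt   # ... then position, using the new velocity
        t += dt
    return x

# The two piecewise-constant forces from the counterexample, V_T = 1, m = 1.
F1 = lambda t: 1.0 if t < 1 else 5.0
F2 = lambda t: 0.0 if t < 1 else 4.0

L1 = distance_to_speed(F1, 1.0)  # analytically 1/2
L2 = distance_to_speed(F2, 1.0)  # analytically 1/8

print(L1, L2, L2 < L1)
```

Despite $F_2(t) < F_1(t)$ everywhere, the simulation confirms $L_2 < L_1$, so the conjectured monotonicity fails.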