How to calculate duration of event at different speeds


Specifically I want to figure out the formula which will tell me: how long will it take to watch this video (normal length $L$) at speed $x$.

I think this will be asymptotic: no matter how fast you play a video, it will take some time; even as the duration dwindles toward $0$, it will never reach it. I also assume that playing a video at speed $x=2$ would mean $\text{duration} = \frac{L}{2}$, but I'm unclear how to translate that into a general equation, for instance when $x=1.4$.

A brief explanation of the method of calculating this would be best.


2 Answers

BEST ANSWER

The playback rate is the number of seconds of video per second of real time, i.e. $R = V/T$. Rearranging that, you get $T = V/R$: the amount of real time equals the length of the video divided by the rate. So a 4-minute video at rate 1.5 takes $4/1.5 \approx 2.67$ minutes, or 2 minutes 40 seconds.
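A minimal sketch of this calculation in Python (the function name `watch_time` is just illustrative):

```python
def watch_time(video_length, rate):
    """Real time needed to watch a video at a given playback rate.

    Uses T = V / R, where V is the video length and R is the rate.
    The result is in the same units as video_length.
    """
    return video_length / rate

# 4-minute video at 1.5x: 240 s / 1.5 = 160 s, i.e. 2 min 40 s
print(watch_time(4 * 60, 1.5))  # 160.0
```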

ANSWER

The basic relation is $$ s = v t $$ where e.g. $v$ is tape speed and $s$ is tape length.

If you go from $v$ to $v' = x v$, the length $s$ stays the same, but the time has to change to $t'$: $$ s = v t = v' t' = x v t' $$ so $$ t' = \frac{v t}{x v} = \frac{t}{x} $$ So if you double the speed ($x=2$), you halve the viewing time to $t' = t/2$. In general, $$ t' = \frac{t}{x} $$
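The general relation $t' = t/x$ also answers the $x = 1.4$ case from the question. A quick hedged sketch (the function name `sped_up_duration` is an assumption, not from the source):

```python
def sped_up_duration(t, x):
    """New viewing time t' = t / x when playback speed is scaled by x."""
    return t / x

# A 60-minute video at 1.4x takes about 42.86 minutes.
print(round(sped_up_duration(60, 1.4), 2))  # 42.86
```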