I know this is some simple thing, and I used to know it, but I just can't recall it.
I'm writing an application that takes screenshots of videos in a user's library. If a video is 35 minutes long, I want to take five screenshots, which are evenly spaced from each other and from the beginning and end.
My best guess so far is to say that interval = length / (num_screenshots + 2), where length is the video's total length as a number of seconds. Then I take a screenshot at 0 + (interval * 1), 0 + (interval * 2), etc up to 5.
But is that the correct/simplest way to calculate this? It feels like there's some obvious thing I'm missing.
For n evenly spaced screenshots, there are n + 1 total intervals, so the formula for each interval should be
interval = length / (num_screenshots + 1). The first mark will be at (interval * 1), the second at (interval * 2), and so on up to (interval * n). I don't see a much simpler way to do this; it seems quite simple to me.

(Figure: Interval Visualization)
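As a quick sketch of that formula (the function name `screenshot_times` is mine, not from your code):

```python
def screenshot_times(length, num_screenshots):
    """Return timestamps (in seconds) for evenly spaced screenshots.

    With n screenshots there are n + 1 equal intervals, so the marks
    fall at interval * 1 .. interval * n, never at 0 or at the end.
    """
    interval = length / (num_screenshots + 1)
    return [interval * i for i in range(1, num_screenshots + 1)]

# Your 35-minute (2100 s) example with 5 screenshots:
print(screenshot_times(2100, 5))  # → [350.0, 700.0, 1050.0, 1400.0, 1750.0]
```

Note that the spacing from the start to the first mark, between marks, and from the last mark to the end all come out equal (350 s here), which is exactly the "evenly spaced from each other and from the beginning and end" requirement.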