Application run time estimation as mathematical formula/function


I'm writing a program that sends out text messages en masse. The program will split up the total number of texts into batches. These batches will be triggered, one at a time, after a set time period, but could technically be running concurrently (i.e. they don't actually wait on the last batch to finish before starting).

The question: How would I represent this in a formula or a function in order to estimate how long this process will take in minutes?

I never took calculus or advanced math, so I'm sure I'm missing something here. Any help is appreciated, but here's what I've got so far:

  • let x be the total number of subscribers receiving texts
  • let y be the frequency (in seconds) at which a batch is triggered
  • let z be the number of texts in each batch
  • let t be the estimated time (in seconds) each text takes to send

$$f\left(x,y,z,t\right) = \frac{1}{60}\left(\left\lceil \frac{x}{z} \right\rceil y + tz\right)$$

So that:

$f\left(4600,60,100,0.5\right) = \frac{1}{60}\left(46 \cdot 60 + 0.5 \cdot 100\right) \approx 46.8$ minutes

Note: the ceiling function $\left\lceil \frac{x}{z} \right\rceil$ is used because a partial batch still has to run; as long as the number of remaining texts is $\ge 1$, one more batch is triggered.
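The formula above is straightforward to evaluate programmatically. Here is a minimal Python sketch (the function name `estimate_minutes` is my own choice, not from the original post) that mirrors the definitions of $x$, $y$, $z$, and $t$ and evaluates the example values:

```python
import math

def estimate_minutes(x, y, z, t):
    """Estimate total runtime in minutes for batched text sends.

    x: total number of subscribers receiving texts
    y: seconds between batch triggers
    z: number of texts per batch
    t: estimated seconds each text takes to send
    """
    batches = math.ceil(x / z)  # a partial batch still counts as a full batch
    # Seconds until the last batch triggers, plus time for one batch to send
    total_seconds = batches * y + t * z
    return total_seconds / 60

print(estimate_minutes(4600, 60, 100, 0.5))  # → 46.833...
```

One modeling note: since batches can run concurrently, an arguably tighter estimate would use $(\lceil x/z \rceil - 1)\,y + tz$, i.e. the last batch triggers after $\lceil x/z \rceil - 1$ intervals and then takes $tz$ seconds to finish; the formula as written counts one extra interval after the final batch starts.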