I've managed to find myself neck deep in a maths problem I have no idea how to solve. I think it relates to the normal distribution and probability, but I'm at a loss as to how to get an answer. I would really appreciate any advice on where to start with this.
I am trying to determine the mean and standard deviation of the total time it takes to update n electronic devices, given the following facts:

1. Only one device can be updated at a time.
2. Each device checks for an update every t seconds.
3. An update takes u seconds to complete, during which time any device trying to check for updates is ignored.
So, to take you through an example where n = 2, t = 10 and u = 1: at time 0 no devices have been updated. I have to wait up to 10 seconds for either device 1 or device 2 to check for an update. Let's say device 1 checks first; I now have to wait 1 second for its update to complete. I then wait up to 10 seconds for device 2 to check for an update, and device 2 takes another 1 second to update.
So a general rule would be that the total update time is at most (u + t)n. The best-case scenario is that every device checks immediately (zero wait), so the total time would be u*n. The worst-case scenario is that every device waits the full 10 seconds, which would mean the total time is (u + 10)n. However, for each device the waiting time follows a changing distribution: the larger n is, the more likely it is that some device checks for an update soon, but as more devices get updated, the average wait for the next check grows.
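The bounds above can be checked numerically. Below is a minimal Monte Carlo sketch of the process as I understand it (the uniform random offsets and the `total_update_time` helper are my assumptions, not part of the question): each device checks at a random offset in [0, t) and then every t seconds, and a check that lands during an update simply retries t seconds later.

```python
import random

def total_update_time(n, t, u, rng):
    """One run: sequential updates, each device checks at a uniform
    random offset in [0, t) and then every t seconds thereafter."""
    next_check = [rng.random() * t for _ in range(n)]  # initial check times
    free_at = 0.0                                      # when the updater is free
    while next_check:
        eff = []
        for c in next_check:
            while c < free_at:   # check landed while an update was running:
                c += t           # the device retries t seconds later
            eff.append(c)
        i = min(range(len(eff)), key=eff.__getitem__)  # earliest effective check
        free_at = eff[i] + u                           # its update runs for u seconds
        next_check.pop(i)
    return free_at

rng = random.Random(1)
n, t, u = 2, 10, 1
samples = [total_update_time(n, t, u, rng) for _ in range(10_000)]
assert all(u * n <= s <= (u + t) * n for s in samples)  # the claimed bounds
print(min(samples), max(samples), sum(samples) / len(samples))
```

Every sample falls between the best case u*n and the worst case (u + t)n, and the empirical mean and spread drop straight out of the samples.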
So is there a way to generalise this and calculate the mean time to update all devices, and also the standard deviation, to produce a bell curve of the distribution?
Thanks in advance to anyone who can spare time on this. It is very much appreciated!
If you have $n$ devices checking every $t$ seconds, the expected waiting time (which should be close to the mean) for the first is $\frac t{n+1}$ seconds. If we model each device as checking exactly every $t$ seconds with a random uniform offset from $0$, the problem is that as we update devices the distribution of checking times is no longer uniform.

If $n$ is very large, it seems that for most of the process the distribution will still be close to uniform. In that case the expected total waiting time is $t(H_{n+1}-1)$, where $H_n$ is the $n$th harmonic number. This is about $t(\log(n+1)-\frac 12)$, where I took $\gamma \approx 0.5$, which seems accurate enough for this. Add the update time of $nu$ and you get $t(\log (n+1)-\frac 12)+nu$.

For large $n$ the update time will dominate, and you have very little dispersion because another device is ready to update as soon as each one finishes; the long waiting times for the last few devices no longer matter. For small $n$, where $t \gt u$, you are dominated by waiting for devices to check. It is in that regime that the correlations are important, so I would think one would have to do a random model.