I have a real world problem that boils down to this:
- I have $S$ sensors that each send a message to the server once every two minutes. Each sensor sends its data at a time uniformly at random within the two-minute period, independently of all other sensors.
- The server can only receive $R$ messages per second (if more are sent, errors result).
How large must $R$ be to ensure that no more than $R$ messages are received within one second 99.995% of the time?
Equivalently, if I roll $S$ dice, each with 120 different faces, how large must $R$ be so that 99.995% of the time no more than $R$ of any given face comes up?
Your description suggests the use of the Poisson distribution:
$$P(k \text{ events in interval}) = e^{-\lambda}\frac{\lambda^k}{k!}$$
In this case, the interval is $1$ second. Each of the $S$ sensors independently lands in a given second with probability $\frac{1}{120}$, so the count per second is $\mathrm{Binomial}(S,\frac{1}{120})$, which is well approximated by a Poisson distribution with $\lambda=\frac{S}{120}$.
You want to find the smallest $R$ such that $\sum_{k=0}^RP(k)>0.99995$, that is, with an exception on average once every $5\tfrac12$ hours.
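This smallest $R$ can also be found numerically by accumulating the Poisson terms until the target is exceeded (a minimal Python sketch; the function name and recurrence are my own, not part of the question):

```python
from math import exp

def smallest_R(S, target=0.99995):
    """Smallest R with sum_{k=0}^{R} P(k) > target, where lambda = S/120."""
    lam = S / 120
    term = exp(-lam)        # P(0) = e^{-lambda}
    cdf = term
    R = 0
    while cdf <= target:
        R += 1
        term *= lam / R     # P(R) = P(R-1) * lambda / R
        cdf += term
    return R

print(smallest_R(80))  # → 6
```

The running-product recurrence avoids computing factorials and large powers separately, which keeps the loop numerically stable for larger $S$.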
Now, the cumulative distribution function is $P(k\le R) = P(k<R+1)=Q(R+1,\lambda)$, where $Q(s,x)=\Gamma(s,x)/\Gamma(s)$ is the regularized upper incomplete gamma function.
Suppose there are $S=80$ sensors. Then by WolframAlpha:
GammaRegularized[6, 80/120] $\approx0.9999309$
GammaRegularized[7, 80/120] $\approx0.9999935$
So $R=6$, which is to say that fewer than $7$ messages will be received within one second $99.99935\%$ of the time, with an exception once every $43$ hours or so.
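The same numbers can be reproduced without the gamma function by summing the Poisson terms directly (a quick check in Python; the helper name is mine):

```python
from math import exp, factorial

lam = 80 / 120  # lambda for S = 80 sensors

def poisson_cdf(r):
    """P(K <= r) for K ~ Poisson(lam), summed term by term."""
    return sum(exp(-lam) * lam**k / factorial(k) for k in range(r + 1))

print(f"{poisson_cdf(5):.7f}")  # 0.9999309 -- matches GammaRegularized[6, 80/120]
print(f"{poisson_cdf(6):.7f}")  # 0.9999935 -- matches GammaRegularized[7, 80/120]
print(f"{1 / (1 - poisson_cdf(6)) / 3600:.0f} hours between exceptions")  # about 43
```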