Poisson process: server throughput to guarantee 99%+ response rate


Each client sends one request per second. The server processes a request in t seconds (t < 1 s, e.g. 0.01 s). We want to serve N clients (70,000). The server can handle at most n simultaneous requests; any request arriving beyond that limit times out.

What should n be so that fewer than 1% (or 0.1%) of requests time out?
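Taking the title's Poisson assumption at face value, the number of requests in flight at any instant is approximately Poisson with mean λ = N·t = 70000 · 0.01 = 700. A minimal sketch (my own, not from the question) then sizes n as the smallest capacity whose exceedance probability is below the target:

```python
import math

N_CLIENTS = 70_000    # clients, one request per second each
SERVICE_TIME = 0.01   # seconds the server holds each request
LAM = N_CLIENTS * SERVICE_TIME  # mean number of in-flight requests (700)

def poisson_quantile(lam: float, p_exceed: float) -> int:
    """Smallest n with P(Poisson(lam) > n) < p_exceed.

    The pmf is accumulated via the recurrence pmf(k) = pmf(k-1) * lam / k,
    tracked in log space so that large lam does not overflow.
    """
    log_pmf = -lam              # log pmf(0) = -lam
    cdf = math.exp(log_pmf)
    n = 0
    while 1.0 - cdf >= p_exceed:
        n += 1
        log_pmf += math.log(lam / n)
        cdf += math.exp(log_pmf)
    return n

print(poisson_quantile(LAM, 0.01))    # capacity for < 1% exceedance
print(poisson_quantile(LAM, 0.001))   # capacity for < 0.1% exceedance
```

Both answers come out only modestly above the mean of 700, since the Poisson standard deviation is √700 ≈ 26.5.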

UPDATE:
Clients are NOT queued. Denial of service happens whenever the server is already handling n simultaneous requests and another client calls.
Worst case: n+m clients call the server at time t0; the server can serve only n of them at a time, so m clients suffer denial of service.
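With no queue and excess arrivals simply dropped, the setup matches the classic M/M/n/n (Erlang loss) model, provided arrivals are taken as Poisson with offered load a = N·t erlangs. The exponential-service assumption is mine (the question says the service time is a fixed t), so this is a sketch, not a definitive sizing; it uses the standard numerically stable Erlang-B recursion:

```python
N_CLIENTS = 70_000
SERVICE_TIME = 0.01
OFFERED_LOAD = N_CLIENTS * SERVICE_TIME  # a = 700 erlangs

def erlang_b(n: int, a: float) -> float:
    """Erlang-B blocking probability for n servers and offered load a,
    via the recursion B(k) = a*B(k-1) / (k + a*B(k-1)), B(0) = 1."""
    b = 1.0
    for k in range(1, n + 1):
        b = a * b / (k + a * b)
    return b

def min_servers(a: float, target: float) -> int:
    """Smallest n whose Erlang-B blocking probability is below target."""
    n, b = 0, 1.0
    while b >= target:
        n += 1
        b = a * b / (n + a * b)
    return n

print(min_servers(OFFERED_LOAD, 0.01))    # n for < 1% blocking
print(min_servers(OFFERED_LOAD, 0.001))   # n for < 0.1% blocking
```

Because blocked requests are lost rather than queued, the Erlang-B answer is somewhat smaller than the Poisson-tail bound: only a little above the 700-erlang offered load, nowhere near N = 70,000.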




In general, such problems are very difficult and, I think, no exact closed-form solution can be given.

The description of the problem that you have provided is not good because, in my opinion, it leads to a trivial solution.

You say "Each client sends an event every second. We want to serve $N$ clients $(70000)$." To me this means you have (at least) $N=70000$ clients, each of whom sends an event every second. Then you say "Server processes an event in $t<1$ seconds". Thus, if you need a "timeout" on less than $1\%$ of requests, you need $n$ such that the inequality $n/N>0.99$ holds.