Queuing Theory: Delay of a Message on a Link


I'm researching load balancing in networks and using queueing theory for modeling, but I cannot understand the reasoning behind some calculations. Some papers and books state that, provided packets arrive at a link according to a Poisson process and packet lengths are exponentially distributed, the mean delay of a message on the link can be computed with the M/M/1 queue formulas, which gives:

$$\text{Delay} = \frac{\text{MessageLength (bits)}}{\text{LinkCapacity (bits/sec)}} \cdot \frac{1}{1-\rho}$$

where

$$\rho = \frac {\text{PacketArrivalRate (packet/sec)}}{\frac{\text{LinkCapacity(bits/sec)}}{\text{MessageLength (bits)}}} $$
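To convince myself the formula is at least plausible, I ran a small sanity check. The sketch below simulates an M/M/1 link with the Lindley recursion and compares the empirical mean delay to the formula above; the capacity, mean message length, and arrival rate are example values I picked, not numbers from any paper:

```python
import random

random.seed(42)

C = 1e6      # assumed example: link capacity in bits/sec
L = 1000.0   # assumed example: mean message length in bits
lam = 600.0  # assumed example: packet arrival rate in packets/sec

mu = C / L       # service rate in packets/sec
rho = lam / mu   # utilization, matching the rho definition above

n = 200_000
w = 0.0          # waiting time of the current packet (Lindley recursion)
s_prev = 0.0     # service time of the previous packet
sojourn_sum = 0.0
for _ in range(n):
    a = random.expovariate(lam)   # exponential interarrival time
    w = max(0.0, w + s_prev - a)  # wait in queue before transmission starts
    s = random.expovariate(mu)    # exponential transmission (service) time
    sojourn_sum += w + s          # total delay = waiting + service
    s_prev = s

sim_delay = sojourn_sum / n
theory_delay = (L / C) / (1.0 - rho)  # the formula from the question
print(sim_delay, theory_delay)
```

With these numbers `theory_delay` is 0.0025 s, and the simulated mean comes out close to it.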

My first question is: why should we use the expression above to calculate the delay instead of the well-known $\frac{1}{\mu-\lambda}$, which in this case is

$$\frac{1}{\text{LinkCapacity(bits/sec)}-\text{PacketArrivalRate (packet/sec)}\cdot \text{MessageLength (bits)}}?$$

My second question is: why is the service time $\frac{1}{\mu}$ equal to

$$\frac{\text{MessageLength (bits)}} {\text{LinkCapacity(bits/sec)}}?$$
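For what it's worth, the units at least suggest this quantity is a time. A tiny check with example numbers I made up:

```python
# Transmission (service) time of one message: bits divided by bits/sec gives seconds.
L_bits = 1000.0  # assumed example: message length in bits
C_bps = 1e6      # assumed example: link capacity in bits/sec

service_time = L_bits / C_bps  # seconds the link needs to clock the message out
mu = 1.0 / service_time        # corresponding service rate in messages/sec
print(service_time, mu)
```

So with these numbers $1/\mu$ is 0.001 s and $\mu$ is 1000 messages/sec, but I'd still like the intuition for why this is the right service time.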