Average queue length for M/M/c queue with time-dependent arrival rates


I'm trying to model the queueing behavior of a system with time-dependent arrival rates. For each time interval $ t $, I have a different value for the arrival rate $ \lambda(t) $ and for the number of available servers $ c(t) $, while the mean service rate $ \mu $ is constant over time. I would like to obtain an approximation of the average queue length for each time interval.

I've tried the stationary approach, solving an M/M/c model for each interval, but there are situations in which the utilization factor $ \rho(t) = \frac{\lambda(t)}{c(t)\,\mu} $ exceeds $ 1 $ for a couple of intervals, so the queue starts to grow before eventually returning to a stable regime with $ \rho < 1 $. The stationary approach does not work in these cases, since every interval is treated independently of the others.
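For reference, the per-interval stationary computation I'm describing can be sketched as follows (the interval values below are hypothetical; $L_q$ comes from the Erlang C formula, and unstable intervals simply have no stationary answer):

```python
from math import factorial

def erlang_c(lam, mu, c):
    """Erlang C: probability an arriving customer must wait in a stationary M/M/c queue."""
    a = lam / mu          # offered load
    rho = a / c           # utilization; must be < 1 for a stationary distribution
    if rho >= 1:
        raise ValueError("unstable interval: rho >= 1")
    top = a**c / factorial(c)
    bottom = (1 - rho) * sum(a**k / factorial(k) for k in range(c)) + top
    return top / bottom

def mean_queue_length(lam, mu, c):
    """Stationary mean number waiting: Lq = C(c, a) * rho / (1 - rho)."""
    rho = lam / (c * mu)
    return erlang_c(lam, mu, c) * rho / (1 - rho)

# Hypothetical intervals as (lambda(t), c(t)) pairs with constant mu:
mu = 1.5
intervals = [(8.0, 10), (12.0, 10), (15.0, 10)]
for lam, c in intervals:
    if lam / (c * mu) < 1:
        print(lam, c, mean_queue_length(lam, mu, c))
    else:
        print(lam, c, "unstable (rho >= 1): stationary Lq undefined")
```

The last interval above illustrates the failure mode: with $\rho = 1$ the stationary $L_q$ is undefined, even though in the real system the queue that built up there would drain in later stable intervals.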

Are there any feasible approximations of the average queue length for this type of model?