Coefficient of Variation in service time of M/G/1 queue


The Coefficient of Variation (CV) $c$ of the effective service time is modelled as $c^2=c_0^2+A(1-A)m_r/t_0$, where the $M/G/1$ server is only partially available and has a mean service time $t_0$. The availability of the server is $A=\frac{m_f}{m_f+m_r}$, where $m_f$ and $m_r$ denote the mean time to failure and the mean time to repair, respectively.

The average cycle time of the queue is $(\frac{1+c^2}{2})(\frac{u}{1-u})t_0$, where $u$ denotes the server utilization.
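The two formulas above can be sketched in a few lines of Python. This is my own minimal sketch, not code from the book: the helper names are mine, and it takes $u$ and $t_0$ as fixed inputs, independent of $A$.

```python
def availability(m_f, m_r):
    """A = m_f / (m_f + m_r)."""
    return m_f / (m_f + m_r)

def cv_squared(c0_sq, A, m_r, t0):
    """c^2 = c_0^2 + A(1 - A) m_r / t_0."""
    return c0_sq + A * (1 - A) * m_r / t0

def avg_cycle_time(c_sq, u, t0):
    """CT = ((1 + c^2) / 2) * (u / (1 - u)) * t_0."""
    return (1 + c_sq) / 2 * u / (1 - u) * t0

# Example with made-up numbers: m_f = 9, m_r = 1 gives A = 0.9;
# with c_0^2 = 1, t_0 = 1, u = 0.5:
A = availability(9, 1)             # 0.9
c_sq = cv_squared(1.0, A, 1, 1)    # 1 + 0.9 * 0.1 = 1.09
ct = avg_cycle_time(c_sq, 0.5, 1)  # (2.09 / 2) * 1 * 1 = 1.045
```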

My doubt is this: when the availability of the server is zero, the corresponding throughput time should be extremely high. However, when I plot cycle time against $A$, I get a symmetric inverted-U shaped curve: the cycle time is minimal at both $A=0$ and $A=1$ and maximal at $A=0.5$. That just doesn't make sense for the situation being modelled.
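The puzzling symmetry is easy to reproduce numerically: if $u$ and $t_0$ are held fixed while only $c^2$ varies with $A$, the cycle time depends on $A$ only through the term $A(1-A)$, which is symmetric about $A=0.5$. A small sketch with hypothetical parameter values:

```python
def cycle_time_vs_A(A, c0_sq=1.0, m_r=1.0, t0=1.0, u=0.5):
    # c^2 = c_0^2 + A(1 - A) m_r / t_0, then CT = ((1 + c^2)/2)(u/(1 - u)) t_0.
    # NOTE: u and t_0 are held constant here, which is exactly the
    # assumption that produces the symmetric inverted-U curve.
    c_sq = c0_sq + A * (1 - A) * m_r / t0
    return (1 + c_sq) / 2 * u / (1 - u) * t0

samples = [cycle_time_vs_A(a / 10) for a in range(11)]
# CT(A) == CT(1 - A): symmetric, minimal at A = 0 and A = 1,
# maximal at A = 0.5 -- reproducing the plot described above.
```

My own reading (not the book's): holding $u$ and $t_0$ fixed as $A$ varies is what removes the blow-up at $A=0$; if the effective service time and hence the utilization grew as availability fell, the cycle time would diverge there instead.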

Am I doing anything wrong? I obtained this model from the book Factory Physics, Chapter 8: Variability Basics.

There is 1 answer below.


For what it's worth, I substituted everything into the expression for the average cycle time and obtained $$ \frac{\lambda(\sigma^2+(m_f^2+m_r^2+m_fm_r(2+m_r)))t_0}{2(m_f+m_r)^2}, $$ where $\lambda$ is the arrival rate and $\sigma^2$ is the variance of the service times. I believe this can be modelled as an $M/G/1$ server with vacations - you may wish to check the literature on that subject.
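As a quick sanity check on the algebra, note that $m_f^2+m_r^2+2m_fm_r=(m_f+m_r)^2$, so the bracketed term regroups as $(m_f+m_r)^2+m_fm_r^2$. A small numeric sketch (parameter values are hypothetical) confirming the two forms agree:

```python
def answer_form(lam, sigma_sq, m_f, m_r, t0):
    # lambda (sigma^2 + (m_f^2 + m_r^2 + m_f m_r (2 + m_r))) t_0 / (2 (m_f + m_r)^2)
    inner = m_f**2 + m_r**2 + m_f * m_r * (2 + m_r)
    return lam * (sigma_sq + inner) * t0 / (2 * (m_f + m_r) ** 2)

def regrouped_form(lam, sigma_sq, m_f, m_r, t0):
    # Same quantity, written as lambda (sigma^2 + (m_f + m_r)^2 + m_f m_r^2) t_0
    # over 2 (m_f + m_r)^2, using the binomial identity above.
    inner = (m_f + m_r) ** 2 + m_f * m_r**2
    return lam * (sigma_sq + inner) * t0 / (2 * (m_f + m_r) ** 2)

# Example: lam = 0.5, sigma^2 = 0, m_f = 2, m_r = 1, t_0 = 1
# gives 0.5 * 11 / 18 = 11/36 from both forms.
```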