How to calculate mean of slowdown


Consider a system in which jobs arrive at a server that serves them in FCFS order. Job sizes (service times) are independently and identically distributed according to a random variable $S$. We define the slowdown of job $i$ as $slowdown(i) = \frac{T(i)}{S(i)}$, where $T(i)$ is the response time of job $i$ and $S(i)$ is the service time of job $i$.

  1. We want to compute the mean slowdown.

    Here is how I approach the problem.

    Since $T(i) = T_q(i) + S(i)$, where $T_q(i)$ is the time job $i$ spends waiting in the queue, and since under FCFS I think $T_q(i)$ is independent of $S(i)$ (the delay depends only on the jobs that arrived earlier), we have $slowdown(i) = \frac{T(i)}{S(i)} = \frac{T_q(i)}{S(i)} + 1$ and thus $E[slowdown] = E[T_q]\,E\!\left[\frac{1}{S}\right] + 1$.

    But I am not sure whether this reasoning is correct.

  2. Another question: if the service order had been Shortest-Job-First (SJF) instead, how can we compute the mean slowdown?
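For question 1, here is a numerical sanity check I sketched myself (not from the book): a minimal M/G/1 FCFS simulation with Poisson arrivals at an assumed rate $\lambda = 0.5$ and service times $S \sim \text{Uniform}(0.5, 1.5)$. The sizes are bounded away from zero because $E[1/S]$ is infinite for, e.g., exponential service times. If the independence argument holds, the empirical mean slowdown should match $E[T_q]\,E[1/S] + 1$:

```python
import math
import random

def simulate_fcfs(lam=0.5, n_jobs=200_000, seed=1):
    """Simulate an M/G/1 FCFS queue with Uniform(0.5, 1.5) service times.

    Returns (empirical mean slowdown, empirical mean queueing delay E[Tq]).
    """
    rng = random.Random(seed)
    arrival = 0.0   # arrival time of the current job
    depart = 0.0    # departure time of the previous job
    slow_sum = 0.0
    wait_sum = 0.0
    for _ in range(n_jobs):
        arrival += rng.expovariate(lam)   # Poisson arrivals at rate lam
        s = rng.uniform(0.5, 1.5)         # service time, bounded away from 0
        start = max(arrival, depart)      # FCFS: wait for the previous job
        depart = start + s
        tq = start - arrival              # time spent waiting in the queue
        wait_sum += tq
        slow_sum += (tq + s) / s          # slowdown(i) = T(i) / S(i)
    return slow_sum / n_jobs, wait_sum / n_jobs

mean_slow, mean_tq = simulate_fcfs()
# For S ~ Uniform(0.5, 1.5): E[1/S] = ln(1.5 / 0.5) / (1.5 - 0.5) = ln 3
predicted = mean_tq * math.log(3) + 1
print(f"simulated  E[slowdown]   = {mean_slow:.3f}")
print(f"E[Tq]*E[1/S] + 1 (pred.) = {predicted:.3f}")
```

With these parameters the two numbers agree to within simulation noise, which is consistent with the independence argument for FCFS.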

Can anyone help me with these two questions?
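For question 2 I don't have a closed form, but one standard route for non-preemptive SJF in an M/G/1 queue is to condition on the job size $s$, obtain the size-conditional waiting time $E[T_q \mid S = s]$, and integrate $E[T_q \mid S = s]/s + 1$ over the size distribution. Whatever expression one derives can be checked against a simulation; here is a minimal non-preemptive SJF sketch of my own, again assuming Poisson arrivals at rate $\lambda = 0.5$ and $S \sim \text{Uniform}(0.5, 1.5)$:

```python
import heapq
import random

def simulate_sjf(lam=0.5, n_jobs=200_000, seed=2):
    """Non-preemptive Shortest-Job-First M/G/1 queue, Uniform(0.5, 1.5) sizes.

    Returns the empirical mean slowdown.
    """
    rng = random.Random(seed)
    # Pre-generate (arrival time, size) pairs for a Poisson arrival stream.
    arrivals = []
    t = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(lam)
        arrivals.append((t, rng.uniform(0.5, 1.5)))
    queue = []      # min-heap keyed by job size
    i = 0           # index of the next arrival not yet admitted
    now = 0.0
    slow_sum = 0.0
    done = 0
    while done < n_jobs:
        # Admit every job that has arrived by the current time.
        while i < n_jobs and arrivals[i][0] <= now:
            t_arr, s = arrivals[i]
            heapq.heappush(queue, (s, t_arr))
            i += 1
        if not queue:
            now = arrivals[i][0]         # server idle: jump to next arrival
            continue
        s, t_arr = heapq.heappop(queue)  # pick the shortest waiting job
        now += s                         # run it to completion (non-preemptive)
        slow_sum += (now - t_arr) / s    # slowdown = response time / size
        done += 1
    return slow_sum / n_jobs

sjf_slow = simulate_sjf()
print(f"SJF simulated E[slowdown] = {sjf_slow:.3f}")
```

Note that under SJF the waiting time of a job is no longer independent of its own size (short jobs jump ahead), so the product decomposition from question 1 should not be expected to hold here.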

By the way, this problem is from chapter 2 of "Performance Modeling and Design of Computer Systems", and I am studying this book on my own.