Expectation and waiting time


There are three jobs that need to be processed, with the processing time of job $i$ being exponential with rate $\mu_i$. There are two processors available, so processing on two of the jobs can immediately start, with processing on the final job to start when one of the initial ones is finished.

$(a)$ Let $T_i$ denote the time at which the processing of job $i$ is completed. If the objective is to minimize $\mathbb E[T_1 + T_2 + T_3]$ , which jobs should be initially processed if $\mu_1 < \mu_2 < \mu_3$?

Intuitively, I believe the right choice would be to start with jobs $2$ and $3$, the ones with the shorter average processing times.

But I cannot see mathematically where this comes from. Suppose I do not even know that $\mu_1<\mu_2<\mu_3$; then $$\mathbb E[T_1+T_2+T_3]=\mathbb E[T_1]+\mathbb E[T_2]+\mathbb E[T_3]=\frac{1}{\mu_1}+\frac{1}{\mu_2}+\frac{1}{\mu_3},$$ so I do not see how the expected total completion time can depend on which jobs are started first.
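A quick Monte Carlo sketch of the model can show whether the expected sum really is the same for every starting pair (the rates below are arbitrary choices of mine, with $\mu_1<\mu_2<\mu_3$; the function name is mine as well):

```python
import random

def simulate_sum(mu, first_pair, n=200_000, seed=0):
    """Monte Carlo estimate of E[T_1 + T_2 + T_3] when the two jobs in
    first_pair (0-based indices) start at time 0 and the remaining job
    starts as soon as one processor becomes free."""
    rng = random.Random(seed)
    i, j = first_pair
    k = ({0, 1, 2} - {i, j}).pop()
    total = 0.0
    for _ in range(n):
        x = [rng.expovariate(m) for m in mu]   # processing times X_1, X_2, X_3
        t_i, t_j = x[i], x[j]                  # jobs i, j run from time 0
        t_k = min(t_i, t_j) + x[k]             # job k waits for the first finisher
        total += t_i + t_j + t_k
    return total / n

mu = (1.0, 2.0, 3.0)                           # example rates, mu_1 < mu_2 < mu_3
for pair in [(0, 1), (0, 2), (1, 2)]:
    print("start", pair, "->", round(simulate_sum(mu, pair), 4))
```

The estimates differ because the job started last completes at $\min(X_i,X_j)+X_k$: the extra waiting term $\mathbb E[X_i\wedge X_j]=\frac1{\mu_i+\mu_j}$ depends on which pair runs first.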

Best answer:

This is Chapter 5, Exercise 16 of *Introduction to Probability Models* by Sheldon Ross ($10^{\mathrm{th}}$ edition). I am not sure that the solution given there is actually correct:

Solution

Let $X_n\sim\operatorname{Exp}(\mu_n)$. If we suppose jobs $i$ and $j$ are initially begun, then the expected time until job $k$ starts is $$\mathbb E[X_i \wedge X_j] = \frac1{\mu_i+\mu_j}. $$ Conditioned on $X_i<X_j$, memorylessness implies the unfinished part of job $j$ is again $\operatorname{Exp}(\mu_j)$, so we may write the remaining time as $X_j \vee X_k+ X_j\wedge X_k$, and as $$X_j+X_k \stackrel d= X_j\vee X_k+X_j\wedge X_k,$$ we have $$\mathbb E[X_j\vee X_k+X_j\wedge X_k] = \mathbb E[X_j+X_k] = \frac1{\mu_j}+\frac1{\mu_k}. $$ Similarly, the expected remaining time conditioned on $X_j<X_i$ is $\frac1{\mu_i}+\frac1{\mu_k}$. As $\mathbb P(X_i<X_j)=\frac{\mu_i}{\mu_i+\mu_j}$ and $\mathbb P(X_j<X_i)=\frac{\mu_j}{\mu_i+\mu_j}$, the total expected time is \begin{align} &\mathbb E[X_i\wedge X_j] + \mathbb E[X_j+X_k]\,\mathbb P(X_i<X_j) + \mathbb E[X_i+X_k]\,\mathbb P(X_j<X_i)\\ &= \frac1{\mu_i+\mu_j} + \left(\frac1{\mu_j}+\frac1{\mu_k}\right)\left(\frac{\mu_i}{\mu_i+\mu_j}\right) +\left(\frac1{\mu_i}+\frac1{\mu_k}\right)\left(\frac{\mu_j}{\mu_i+\mu_j}\right)\\ &= \frac1{\mu_i+\mu_j} +\frac1{\mu_k}+\frac{\mu_i^2+\mu_j^2}{\mu_i\mu_j(\mu_i+\mu_j)}\\ &= \frac1{\mu_k} + \frac{\mu_i^2 + \mu_i\mu_j + \mu_j^2}{\mu_i\mu_j(\mu_i+\mu_j)}\\ &= \frac1{\mu_k}+ \frac{(\mu_i + \mu_j)^2-\mu_i\mu_j}{\mu_i\mu_j(\mu_i+\mu_j)}\\ &= \frac1{\mu_k}+ \frac{\mu_i+\mu_j}{\mu_i\mu_j} - \frac1{\mu_i+\mu_j}\\ &= \frac1{\mu_i}+ \frac1{\mu_j}+\frac1{\mu_k} - \frac1{\mu_i+\mu_j}. \end{align}

If there were only one server, then the expected time would be $$\frac1{\mu_1}+\frac1{\mu_2}+\frac1{\mu_3} < \frac1{\mu_i}+\frac1{\mu_j} + \frac1{\mu_i+\mu_j} +\frac1{\mu_k}, $$ and it is clear that removing a server cannot reduce the expected waiting time.
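Since the quoted derivation is in doubt, a small simulation (rates and names below are arbitrary choices of mine) can compare a Monte Carlo estimate of $\mathbb E[T_1+T_2+T_3]$ against a direct linearity-of-expectation computation, $\mathbb E[T_i]+\mathbb E[T_j]+\mathbb E[T_k]=\frac1{\mu_i}+\frac1{\mu_j}+\frac1{\mu_k}+\frac1{\mu_i+\mu_j}$, where the last term is the wait $\mathbb E[X_i\wedge X_j]$ built into $T_k$:

```python
import random

def mc_total(mu_i, mu_j, mu_k, n=500_000, seed=1):
    """Monte Carlo estimate of E[T_i + T_j + T_k] when jobs i and j
    start first and job k begins once a processor is free."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        xi, xj, xk = (rng.expovariate(m) for m in (mu_i, mu_j, mu_k))
        total += xi + xj + (min(xi, xj) + xk)  # T_i + T_j + T_k
    return total / n

mu_i, mu_j, mu_k = 1.0, 2.0, 3.0               # arbitrary rates for the check
linearity = 1/mu_i + 1/mu_j + 1/mu_k + 1/(mu_i + mu_j)
print("simulated :", round(mc_total(mu_i, mu_j, mu_k), 4))
print("linearity :", round(linearity, 4))
```

In my runs the estimate agrees with the linearity value (which *adds* a term in $\frac1{\mu_i+\mu_j}$ rather than subtracting one), which supports the doubt expressed above: the quoted computation appears to track a different quantity than $\mathbb E[T_1+T_2+T_3]$.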


This is not a full solution; I will simply try to explain how the order in which the jobs start affects the quantity you want to minimize:

If jobs $1$ and $2$ start first, the probability that job $1$ finishes before job $2$ is $$p_{12}=\int_0^{\infty}\int_0^{t_2}\mu_1\mu_2e^{-\mu_1t_1-\mu_2t_2}\,dt_1\,dt_2=\frac{\mu_1}{\mu_1+\mu_2},$$ so job $3$ starts after job $1$ finishes with probability $p_{12}$, and after job $2$ finishes with probability $1-p_{12}$. Hence in this case $$\mathbb E[T_1+T_2+T_3]=p_{12}\,\mathbb E\Big[\max\{T_3,T_2\}\Big]+(1-p_{12})\,\mathbb E\Big[\max\{T_3,T_1\}\Big].$$
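The competing-exponentials probability above is easy to sanity-check numerically (a minimal sketch; the rates and the function name are my own choices):

```python
import random

def p_first(mu_1, mu_2, n=300_000, seed=2):
    """Estimate P(X_1 < X_2) for independent X_1 ~ Exp(mu_1), X_2 ~ Exp(mu_2)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n):
        if rng.expovariate(mu_1) < rng.expovariate(mu_2):
            wins += 1
    return wins / n

mu_1, mu_2 = 1.0, 2.0                      # arbitrary rates for the check
print(p_first(mu_1, mu_2), "vs", mu_1 / (mu_1 + mu_2))
```

The empirical frequency should sit close to $\frac{\mu_1}{\mu_1+\mu_2}$, matching the double integral.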