Write a computer programme that, by means of stochastic simulation, finds an approximation of the variance of a typical waiting time W(q) (in the queue) before service for a typical customer arriving at a steady-state M(1)/M(2)/1/2 queuing system. (In other words, the queuing system has exp(1)-distributed times between arrivals of new customers and exp(2)-distributed service times. Further, the system has one server and one queuing place.)
My attempt at a solution:
N = 100000
wait = vector(length=N)
for (i in 1:N) {
  t1 = rexp(1, 1)  # time until the next arrival
  s1 = rexp(1, 2)  # service time of the customer in service
  if (s1 < t1) {
    wait[i] = 0
  } else {
    wait[i] = s1
  }
}
VarWait = var(wait)
cat("Variance of a typical waiting time W(q) = ", VarWait, "\n")
I get 0.2704, but the answer should be 0.1413.
This should be super simple, but I'm stuck... Can anyone spot my mistake?
I read the logic to be that at the start of each loop iteration, we are in the state where the queue is empty and a customer is being served. Then $s1$ is his service time and $t1$ is the time until the next customer arrives.
If $s1 \lt t1$, then the customer in service finishes before the next customer arrives, so the next customer's waiting time is $0$; you have that right.
If $t1 \lt s1$, then the new customer arrives while the first is still in service and has to wait. You have him waiting for time $s1$, but I think that's wrong: he begins waiting only from the moment he arrives, so his waiting time is the remaining service time of the previous customer. By the memorylessness of the exponential distribution, that remaining time is again exp(2)-distributed, so we can simply draw a fresh service time, say $s2$, and use it as his waiting time.
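Here is a minimal sketch of that fix in R, keeping the structure of your original programme (the name $s2$ follows the suggestion above; everything else is as in your code):

N = 100000
wait = numeric(N)
for (i in 1:N) {
  t1 = rexp(1, 1)  # time until the next customer arrives
  s1 = rexp(1, 2)  # service time of the customer in service
  if (s1 < t1) {
    wait[i] = 0  # service finishes first, so the arriving customer waits 0
  } else {
    # By memorylessness, the remaining service time at the arrival instant
    # is again exp(2)-distributed, so draw a fresh s2 for the wait
    s2 = rexp(1, 2)
    wait[i] = s2
  }
}
VarWait = var(wait)
cat("Variance of a typical waiting time W(q) = ", VarWait, "\n")

As a sanity check, in this model $W(q) = 0$ with probability $P(s1 \lt t1) = 2/3$ and $W(q) \sim \exp(2)$ with probability $1/3$, so $E[W(q)] = \frac13 \cdot \frac12 = \frac16$, $E[W(q)^2] = \frac13 \cdot \frac{2}{2^2} = \frac16$, and $\mathrm{Var}(W(q)) = \frac16 - \frac1{36} = \frac{5}{36} \approx 0.139$, which is close to the quoted $0.1413$ (the gap is within Monte Carlo error at $N = 10^5$).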