I'm trying to prove that the expected total time spent in an $M/M/n$ queue is less than the expected total time spent in a system with $n$ independent $M/M/1$ queues.
Let $\lambda$ be the arrival rate of customers to both systems (i.e., customers arrive at rate $\frac{\lambda}{n}$ to each independent $M/M/1$ queue), and let the service rate of each server be $\mu > \frac{\lambda}{n}$.
However, I can't seem to prove this directly; the closed-form means below don't seem to compare easily.
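For reference, the standard formulas for the mean time in system (assuming stability, i.e. $\lambda < n\mu$) are
$$
E[T_{M/M/1}] = \frac{1}{\mu - \frac{\lambda}{n}}, \qquad E[T_{M/M/n}] = \frac{1}{\mu} + \frac{C\left(n, \frac{\lambda}{\mu}\right)}{n\mu - \lambda},
$$
where $C(n, a)$ is the Erlang C probability that an arriving job has to wait.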
Can anyone help me do this?
You can do it by a coupling argument.
If there are $k$ jobs in the system, then the M/M/$n$ queue will process them at a rate of $\min\{k,n\}\mu$, whereas the system with $n$ independent M/M/$1$ queues will process them at a rate of $j\mu$, where $j \le \min\{k,n\}$ is the number of active queues.
So, no matter how many jobs are in the system (and however they are distributed among the M/M/$1$ queues), the system with independent queues processes them no faster than the M/M/$n$ queue, and sometimes strictly slower.
(For example, if $2$ jobs sit in the same queue, the system with independent queues only processes them at rate $\mu$, while the M/M/$n$ queue would process them at rate $2\mu$.) Since both systems see the same arrival stream, this coupling keeps the number of jobs in the independent-queue system at least as large as in the M/M/$n$ queue at all times; the queue length in the independent system is therefore stochastically larger, and Little's law converts that into a larger expected total time in system.
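As a numerical sanity check, here is a minimal simulation sketch (the function names and the parameter values in the `__main__` block are my own, hypothetical choices). It estimates the mean sojourn time of the FCFS M/M/$n$ queue via the Kiefer-Wolfowitz workload-vector recursion, and that of the independent system via Lindley's recursion for a single M/M/$1$ queue at arrival rate $\lambda/n$ (by symmetry, one queue is representative of all $n$).

```python
import random

def mmn_mean_sojourn(lam, mu, n, num_jobs=200_000, seed=0):
    """Estimate E[T] for an FCFS M/M/n queue using the
    Kiefer-Wolfowitz workload-vector recursion."""
    rng = random.Random(seed)
    v = [0.0] * n                       # sorted residual workloads of the n servers
    total = 0.0
    for _ in range(num_jobs):
        s = rng.expovariate(mu)         # this job's service requirement
        total += v[0] + s               # sojourn = wait (smallest workload) + service
        v[0] += s                       # the job joins the first server to free up
        a = rng.expovariate(lam)        # time until the next arrival
        v = sorted(max(0.0, w - a) for w in v)
    return total / num_jobs

def mm1_mean_sojourn(lam, mu, num_jobs=200_000, seed=1):
    """Estimate E[T] for a single M/M/1 queue using Lindley's recursion."""
    rng = random.Random(seed)
    w = 0.0                             # waiting time of the current job
    total = 0.0
    for _ in range(num_jobs):
        s = rng.expovariate(mu)
        total += w + s                  # sojourn = waiting + service
        a = rng.expovariate(lam)
        w = max(0.0, w + s - a)         # Lindley: next job's waiting time
    return total / num_jobs

if __name__ == "__main__":
    lam, mu, n = 3.0, 1.0, 4            # hypothetical values with lam < n * mu
    print(f"M/M/{n} pooled:        E[T] ~ {mmn_mean_sojourn(lam, mu, n):.3f}")
    print(f"{n} independent M/M/1: E[T] ~ {mm1_mean_sojourn(lam / n, mu):.3f}")
```

Both estimators start from an empty system, so there is a small downward transient bias, but over a few hundred thousand jobs the pooled M/M/$n$ estimate should come out clearly below the independent-queue estimate, as the coupling argument predicts.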