The mean time spent in an $M/M/1/\infty$ queue is $W = \frac{1/\mu}{1-\rho} = \frac{1}{\mu - \lambda}$, where $\rho = \lambda/\mu$, if I am not mistaken.
If the queue independently discards each incoming packet with probability $p=0.5$, how does the computation of the mean time that accepted packets spend in the system change?
Independently discarding each arrival with probability $p$ is a *thinning* of the Poisson process, so the accepted packets again form a Poisson process, with rate $(1-p)\lambda$. You can then use the same expression with the thinned rate: $W = \frac{1}{\mu - (1-p)\lambda}$. (With $p = 0.5$ the accepted rate is $\lambda/2$; note it is $(1-p)\lambda$, not $p\lambda$, that survives when $p$ is the drop probability.)
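Not part of the original answer, but here is a quick sanity check: a minimal Python simulation of the FIFO $M/M/1$ queue with thinned arrivals (the function name `mmone_mean_sojourn` and all parameter values are my own choices), comparing the simulated mean sojourn time of accepted packets against $1/(\mu - (1-p)\lambda)$.

```python
import random

def mmone_mean_sojourn(lam, mu, p_drop, n=200_000, seed=42):
    """Simulate an M/M/1 FIFO queue where each arrival is independently
    dropped with probability p_drop; return the mean sojourn time
    (waiting + service) of the accepted packets."""
    rng = random.Random(seed)
    lam_eff = (1 - p_drop) * lam   # thinned Poisson arrival rate
    t = 0.0        # arrival time of the current accepted packet
    depart = 0.0   # departure time of the previous packet
    total = 0.0
    for _ in range(n):
        t += rng.expovariate(lam_eff)         # next accepted arrival
        start = max(t, depart)                # wait if the server is busy
        depart = start + rng.expovariate(mu)  # exponential service
        total += depart - t                   # sojourn time in system
    return total / n

lam, mu, p = 3.0, 2.0, 0.5
sim = mmone_mean_sojourn(lam, mu, p)
theory = 1.0 / (mu - (1 - p) * lam)  # = (1/mu)/(1 - rho) with rho = (1-p)*lam/mu
print(f"simulated: {sim:.3f}, theory: {theory:.3f}")
```

Generating accepted arrivals directly at rate $(1-p)\lambda$ is equivalent to generating all arrivals at rate $\lambda$ and flipping a coin for each one, which is exactly the thinning property being used above.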