Inequality between Independent Erlang random variables


I have two independent Erlang random variables as follows,

$$X \sim {\rm Erl}(n, \lambda) \text{ and } Y \sim {\rm Erl}(m, \mu)$$

Here $\lambda$ is the rate parameter, i.e., $f_X(x) = \lambda^n \frac {e^{-\lambda x } x^{n-1}} {(n-1)!}$ for $x > 0$.

I want to calculate $F (n, m) = \mathbb P(X < Y ).$

I tried the following:

$$\mathbb P(X < Y) = \int_0^\infty \mathbb P(X < Y \mid X=x ) \cdot f_X(x) \, dx,$$

which on simplification gives

$$\sum_{r=0}^{m-1} {n+r-1 \choose r} \left(\frac{\lambda}{\lambda+\mu}\right)^n \left(\frac{\mu}{\lambda+\mu}\right)^r.$$

But the answer given is $$\sum_{k=n}^{n+m-1} {n+m-1 \choose k} \left(\frac{\lambda}{\lambda+\mu}\right)^k \left(\frac{\mu}{\lambda+\mu}\right)^{n+m-k-1}.$$
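For what it's worth, a quick exact check in Python (my own sketch; the helper names and test values are mine) suggests the two expressions actually agree, so I suspect they are equivalent:

```python
from fractions import Fraction
from math import comb

def my_sum(n, m, lam, mu):
    # My result: sum_{r=0}^{m-1} C(n+r-1, r) (lam/(lam+mu))^n (mu/(lam+mu))^r
    p = Fraction(lam, lam + mu)
    q = Fraction(mu, lam + mu)
    return sum(comb(n + r - 1, r) * p**n * q**r for r in range(m))

def given_answer(n, m, lam, mu):
    # The given answer: sum_{k=n}^{n+m-1} C(n+m-1, k) (lam/(lam+mu))^k (mu/(lam+mu))^(n+m-k-1)
    p = Fraction(lam, lam + mu)
    q = Fraction(mu, lam + mu)
    return sum(comb(n + m - 1, k) * p**k * q**(n + m - k - 1) for k in range(n, n + m))

print(my_sum(3, 4, 2, 5) == given_answer(3, 4, 2, 5))  # → True
```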

Can someone help me here?


Best answer:

Note that the Erlang PDF comes from the Poisson process. In fact, $X\sim \mathrm{Erl}(n, \lambda)$ means that $X$ is the $n$-th arrival time in the Poisson process of rate $\lambda$. This means that the number of arrivals in a given time interval $[0,x]$ is Poisson distributed with parameter $x\lambda$.

Using $Y\sim \mathrm{Erl}(m,\mu)$, we have for $x>0$, $$ P(x<Y)=\sum_{k=0}^{m-1} \frac{e^{-x\mu}(x\mu)^k}{k!}. $$ Then $$ \begin{align} P(X<Y)&=\int_0^{\infty} P(x<Y) \lambda^n \frac {e^{-\lambda x } x^{n-1}} {(n-1)!} dx\\ &=\int_0^\infty \sum_{k=0}^{m-1} \frac{e^{-x\mu}(x\mu)^k}{k!} \lambda^n \frac {e^{-\lambda x } x^{n-1}} {(n-1)!}dx \\ &=\lambda^n \sum_{k=0}^{m-1} \binom{k+n-1}{n-1}\mu^k\int_0^{\infty} \frac{e^{-(\lambda+\mu)x}x^{k+n-1}}{(k+n-1)!} dx \\ &=\lambda^n \sum_{k=0}^{m-1} \binom{k+n-1}{n-1}\mu^k \frac1{(\lambda+\mu)^{k+n}}\\ &=\sum_{k=0}^{m-1} \binom{k+n-1}{n-1} \left(\frac{\lambda}{\lambda+\mu}\right)^n\left(\frac{\mu}{\lambda+\mu}\right)^k. \ \ \ (*) \end{align} $$ This expression looks different, but it is equivalent to $$ \sum_{k=n}^{n+m-1} {n+m-1 \choose k} \left(\frac{\lambda}{\lambda+\mu}\right)^k \left(\frac{\mu}{\lambda+\mu}\right)^{n+m-k-1}. \ \ \ \ (**)$$
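As a numerical cross-check of $(*)$, here is a small Monte Carlo sketch (mine, standard library only; parameter values are arbitrary):

```python
import random
from math import comb

def formula_star(n, m, lam, mu):
    # Formula (*): sum_{k=0}^{m-1} C(k+n-1, n-1) (lam/(lam+mu))^n (mu/(lam+mu))^k
    p = lam / (lam + mu)
    q = mu / (lam + mu)
    return sum(comb(k + n - 1, n - 1) * p**n * q**k for k in range(m))

random.seed(1)
n, m, lam, mu = 2, 3, 1.5, 2.0
trials = 200_000
# Erlang(n, lam) is a sum of n i.i.d. Exp(lam) variables;
# random.gammavariate(shape, scale) with scale = 1/rate draws one sample.
hits = sum(
    random.gammavariate(n, 1 / lam) < random.gammavariate(m, 1 / mu)
    for _ in range(trials)
)
est = hits / trials
exact = formula_star(n, m, lam, mu)
print(est, exact)  # the two values should agree to about two decimal places
```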

Let $T\sim NB(n, \frac{\lambda}{\lambda+\mu})$ be the trial on which the $n$-th success occurs, where each trial succeeds independently with probability $\frac{\lambda}{\lambda+\mu}$. Let $S\sim B(n+m-1,\frac{\lambda}{\lambda+\mu})$ be the number of successes in the first $n+m-1$ of the same trials. Since $T\geq n$ always, $S\leq n+m-1$ always, and the $n$-th success occurs by trial $n+m-1$ exactly when the first $n+m-1$ trials contain at least $n$ successes, we get $$ n\leq T\leq n+m-1 \Longleftrightarrow T\leq n+m-1 \Longleftrightarrow S\geq n \Longleftrightarrow n\leq S\leq n+m-1. $$ The formula $(*)$ is the expression for $P(n\leq T\leq n+m-1)$ and the second formula $(**)$ is $P(n\leq S\leq n+m-1)$. Thus, we must have $$ P(n\leq T\leq n+m-1) = P(n\leq S\leq n+m-1). $$ This shows that $(*)$ and $(**)$ are equivalent.
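For concreteness, the coincidence of the two tail probabilities can be verified exactly with hand-rolled pmfs (a sketch of mine; $p$ stands for $\frac{\lambda}{\lambda+\mu}$, and the test values are arbitrary):

```python
from fractions import Fraction
from math import comb

def nb_pmf(t, n, p):
    # P(T = t): the n-th success occurs on trial t (t >= n)
    return comb(t - 1, n - 1) * p**n * (1 - p)**(t - n)

def bin_pmf(s, N, p):
    # P(S = s): s successes in N Bernoulli(p) trials
    return comb(N, s) * p**s * (1 - p)**(N - s)

n, m = 3, 4
p = Fraction(2, 7)  # plays the role of lambda/(lambda+mu)
lhs = sum(nb_pmf(t, n, p) for t in range(n, n + m))           # P(n <= T <= n+m-1)
rhs = sum(bin_pmf(s, n + m - 1, p) for s in range(n, n + m))  # P(n <= S <= n+m-1)
print(lhs == rhs)  # → True
```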

A special case $n=m=1$ is easier to calculate: in this case $X \sim \mathrm{Exp}(\lambda)$ and $Y\sim \mathrm{Exp}(\mu)$. Then $$ \begin{align} P(X<Y)&=\int_0^\infty P(x<Y) \lambda e^{-\lambda x} dx\\ &=\lambda \int_0^{\infty} e^{-(\lambda+\mu)x} dx = \frac{\lambda}{\lambda+\mu}. \end{align} $$ There is also a direct argument that leads to the binomial and negative binomial distributions.
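The general formula $(*)$ indeed reduces to $\frac{\lambda}{\lambda+\mu}$ when $n=m=1$; a tiny exact check (my own helper, integer rates for simplicity):

```python
from fractions import Fraction
from math import comb

def formula_star(n, m, lam, mu):
    # Formula (*) in exact rational arithmetic
    p = Fraction(lam, lam + mu)
    q = Fraction(mu, lam + mu)
    return sum(comb(k + n - 1, n - 1) * p**n * q**k for k in range(m))

lam, mu = 3, 5
print(formula_star(1, 1, lam, mu) == Fraction(lam, lam + mu))  # → True
```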

Let $\{A_t\}$ and $\{B_t\}$ be independent Poisson processes with rates $\lambda$ and $\mu$ respectively, and merge them into a single stream of arrivals. Count an arrival from $\{A_t\}$ as a success and an arrival from $\{B_t\}$ as a failure. By the $n=m=1$ case above, each arrival of the merged process is a success with probability $\frac{\lambda}{\lambda+\mu}$, independently of the others. The event that the $n$-th arrival time from $\{A_t\}$ is less than the $m$-th arrival time from $\{B_t\}$ can be described as

The $n$-th success happens before the $m$-th failure. $\ \ \ \ \rm (I)$

The formulas $(*)$ and $(**)$ both represent the probability of the event $ \rm(I)$.
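The trial-by-trial description of event $\rm (I)$ can also be simulated directly (a rough sketch of mine; the helper name and parameter values are my own):

```python
import random
from math import comb

def nth_success_first(n, m, p, rng):
    # Run Bernoulli(p) trials until either n successes or m failures occur;
    # return True if the n-th success comes first -- this is event (I).
    successes = failures = 0
    while successes < n and failures < m:
        if rng.random() < p:
            successes += 1
        else:
            failures += 1
    return successes == n

rng = random.Random(0)
n, m, lam, mu = 2, 3, 1.5, 2.0
p = lam / (lam + mu)  # success = arrival from {A_t} in the merged process
trials = 200_000
est = sum(nth_success_first(n, m, p, rng) for _ in range(trials)) / trials
# Formula (*) with the same parameters:
exact = sum(comb(k + n - 1, n - 1) * p**n * (1 - p)**k for k in range(m))
print(est, exact)  # the two values should agree to about two decimal places
```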

Second answer:

Here's a hasty stab at it, which probably has some details wrong. An essential step is the binomial theorem. Note that one can never pull out of a sum a factor that depends on the index of summation, nor pull out of an integral a factor that depends on the variable of integration. And it should be clear from the start that the bottom line should depend on $\mu$ and $\lambda$ only through the ratio $\mu/\lambda$, as indeed happens below and also in the answers mentioned in the question.

\begin{align} \Pr(X<Y) & = \operatorname{E}(\Pr(X<Y \mid X)) \\[10pt] & = \int_0^\infty \Pr(X<Y\mid X=x) f_X(x)\, dx \\[10pt] & = \int_0^\infty \left( \int_x^\infty f_Y(y)\,dy \right) f_X(x) \, dx \\[10pt] & = \int_0^\infty \left( \int_x^\infty \frac 1 {(m-1)!} (\mu y)^{m-1} e^{-\mu y} (\mu\, dy) \right) \frac 1 {(n-1)!} (\lambda x)^{n-1} e^{-\lambda x} (\lambda \, dx) \\[10pt] & = \frac 1 {(m-1)!(n-1)!} \int_0^\infty \left( \int_{\mu v/\lambda}^\infty u^{m-1} e^{-u}\,du \right) v^{n-1} e^{-v} \, dv \qquad (\text{where } u = \mu y,\ v = \lambda x) \\[10pt] & = \frac 1 {(m-1)!(n-1)!} \int_0^\infty \left( \int_0^\infty \left( w + \frac{\mu v} \lambda \right)^{m-1} e^{-(w + (\mu v/\lambda))} \, dw \right) v^{n-1} e^{-v} \, dv \qquad (\text{where } w = u - \mu v/\lambda) \\[10pt] & = \frac 1 {(m-1)!(n-1)!} \int_0^\infty \left( \int_0^\infty e^{-\mu v/\lambda} \sum_{k=0}^{m-1} \binom{m-1} k w^k \left( \frac{\mu v} \lambda \right)^{m-1-k} e^{-w} \, dw \right) v^{n-1} e^{-v} \, dv \\[10pt] & = \frac 1 {(m-1)!(n-1)!} \int_0^\infty \left( e^{-\mu v/\lambda} \sum_{k=0}^{m-1} \binom{m-1} k \left( \frac{\mu v} \lambda \right)^{m-1-k} \int_0^\infty w^k e^{-w}\,dw \right) v^{n-1} e^{-v}\,dv \\[10pt] & = \frac 1 {(m-1)!(n-1)!} \int_0^\infty \left( e^{-\mu v/\lambda} \sum_{k=0}^{m-1} \binom{m-1} k \left( \frac{\mu v} \lambda \right)^{m-1-k} k! \right) v^{n-1} e^{-v}\,dv \\[10pt] & = \frac 1 {(n-1)!} \sum_{k=0}^{m-1} \frac {(\mu/\lambda)^{m-1-k}} {(m-1-k)!} \int_0^\infty v^{m+n-k-2} e^{-v(1+ \mu/\lambda)} \, dv \\[10pt] & = \frac 1 {(n-1)!} \sum_{k=0}^{m-1} \frac {(\mu/\lambda)^{m-1-k}} {(m-1-k)!} \frac{(m+n-k-2)!}{\left( 1 + \frac \mu \lambda \right)^{m+n-k-1}} \\[10pt] & = \frac 1 {(n-1)!} \sum_{\ell=0}^{m-1} \frac {(\mu/\lambda)^\ell} {\ell!} \frac{(\ell+n-1)!}{\left( 1 + \frac \mu \lambda \right)^{\ell+n}} \qquad (\text{where } \ell = m-1-k) \\[10pt] & = \left( \frac \lambda {\lambda + \mu} \right)^n\ \sum_{\ell=0}^{m-1} \binom{\ell+n-1}{\ell} \left( \frac\mu{\lambda+\mu} \right)^\ell, \end{align} which is the same sum obtained in the question (with $\ell$ in place of $r$).
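One can sanity-check the bottom line exactly (a quick sketch of mine; note that the algebra produces a binomial coefficient $\binom{\ell+n-1}{\ell}$ in front of each term) against the answer quoted in the question:

```python
from fractions import Fraction
from math import comb

def bottom_line(n, m, lam, mu):
    # (lam/(lam+mu))^n * sum_{l=0}^{m-1} C(l+n-1, l) (mu/(lam+mu))^l
    p = Fraction(lam, lam + mu)
    q = Fraction(mu, lam + mu)
    return p**n * sum(comb(l + n - 1, l) * q**l for l in range(m))

def quoted_answer(n, m, lam, mu):
    # sum_{k=n}^{n+m-1} C(n+m-1, k) (lam/(lam+mu))^k (mu/(lam+mu))^(n+m-k-1)
    p = Fraction(lam, lam + mu)
    q = Fraction(mu, lam + mu)
    return sum(comb(n + m - 1, k) * p**k * q**(n + m - k - 1) for k in range(n, n + m))

print(bottom_line(4, 3, 2, 3) == quoted_answer(4, 3, 2, 3))  # → True
```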