I have two independent Erlang random variables,
$$X \sim {\rm Erl}(n, \lambda) \text{ and } Y \sim {\rm Erl}(m, \mu)$$
Here $\lambda$ is the rate parameter, i.e., $f_X(x) = \lambda^n \frac {e^{-\lambda x } x^{n-1}} {(n-1)!}$ for $x > 0$.
I want to calculate $F (n, m) = \mathbb P(X < Y ).$
I tried to do the following,
$$\int_0^\infty \mathbb P(X < Y \mid X=x ) \cdot f_X(x) \, dx,$$
which on simplification gives
$$\sum_{r=0}^{m-1} {n+r-1 \choose r} \left(\frac{\lambda}{\lambda+\mu}\right)^n \left(\frac{\mu}{\lambda+\mu}\right)^r.$$
But the answer given is $$\sum_{k=n}^{n+m-1} {n+m-1 \choose k} \left(\frac{\lambda}{\lambda+\mu}\right)^k \left(\frac{\mu}{\lambda+\mu}\right)^{n+m-k-1}.$$
Can someone help me here?
Note that the Erlang PDF comes from the Poisson process. In fact, $X\sim \mathrm{Erl}(n, \lambda)$ means that $X$ is the $n$-th arrival time in a Poisson process of rate $\lambda$. This means that the number of arrivals in a given time interval $[0,x]$ is Poisson distributed with parameter $x\lambda$.
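As a quick numerical sanity check of the identity $P(x<Y)=\sum_{k=0}^{m-1} e^{-x\mu}(x\mu)^k/k!$, here is a small sketch that compares the Poisson sum against a Monte Carlo estimate of $P(Y>x)$; it uses `random.gammavariate(shape, scale)` with integer shape $m$ and scale $1/\mu$ to sample $\mathrm{Erl}(m,\mu)$ (the specific values of $m$, $\mu$, $x$ are arbitrary choices of mine):

```python
import math
import random

# Check P(x < Y) = sum_{k=0}^{m-1} e^{-x mu} (x mu)^k / k!  for Y ~ Erl(m, mu).
m, mu, x = 4, 1.7, 2.0

# Left side via the Poisson probabilities of fewer than m arrivals in [0, x].
poisson_sum = sum(
    math.exp(-x * mu) * (x * mu) ** k / math.factorial(k) for k in range(m)
)

# Right side via Monte Carlo: gammavariate with integer shape is an Erlang sample;
# the second argument is the scale, i.e. 1/rate.
random.seed(0)
trials = 200_000
empirical = sum(random.gammavariate(m, 1 / mu) > x for _ in range(trials)) / trials

print(poisson_sum, empirical)  # the two estimates should be close
```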
Using $Y\sim \mathrm{Erl}(m,\mu)$, we have for $x>0$, $$ P(x<Y)=\sum_{k=0}^{m-1} \frac{e^{-x\mu}(x\mu)^k}{k!}. $$ Then $$ \begin{align} P(X<Y)&=\int_0^{\infty} P(x<Y) \lambda^n \frac {e^{-\lambda x } x^{n-1}} {(n-1)!} dx\\ &=\int_0^\infty \sum_{k=0}^{m-1} \frac{e^{-x\mu}(x\mu)^k}{k!} \lambda^n \frac {e^{-\lambda x } x^{n-1}} {(n-1)!}dx \\ &=\lambda^n \sum_{k=0}^{m-1} \binom{k+n-1}{n-1}\mu^k\int_0^{\infty} \frac{e^{-(\lambda+\mu)x}x^{k+n-1}}{(k+n-1)!} dx \\ &=\lambda^n \sum_{k=0}^{m-1} \binom{k+n-1}{n-1}\mu^k \frac1{(\lambda+\mu)^{k+n}}\\ &=\sum_{k=0}^{m-1} \binom{k+n-1}{n-1} \left(\frac{\lambda}{\lambda+\mu}\right)^n\left(\frac{\mu}{\lambda+\mu}\right)^k. \ \ \ (*) \end{align} $$ This expression seems different, but this is equivalent to: $$ \sum_{k=n}^{n+m-1} {n+m-1 \choose k} \left(\frac{\lambda}{\lambda+\mu}\right)^k \left(\frac{\mu}{\lambda+\mu}\right)^{n+m-k-1}. \ \ \ \ (**)$$
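The closed form $(*)$ can likewise be checked against simulation. Below is a sketch (the function name `p_x_less_y` and the test parameters are mine) that evaluates $(*)$ with `math.comb` and compares it to the empirical frequency of $\{X<Y\}$ over independent Erlang samples:

```python
import math
import random

def p_x_less_y(n, m, lam, mu):
    """Formula (*): P(X < Y) for X ~ Erl(n, lam), Y ~ Erl(m, mu)."""
    p = lam / (lam + mu)
    q = mu / (lam + mu)
    return sum(math.comb(k + n - 1, n - 1) * p**n * q**k for k in range(m))

# Monte Carlo check: gammavariate(shape, scale) with integer shape and
# scale = 1/rate draws an Erlang sample.
random.seed(0)
n, m, lam, mu = 3, 2, 1.5, 2.0
trials = 200_000
hits = sum(
    random.gammavariate(n, 1 / lam) < random.gammavariate(m, 1 / mu)
    for _ in range(trials)
)
print(p_x_less_y(n, m, lam, mu), hits / trials)  # should agree closely
```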
Let $T\sim NB(n, \frac{\lambda}{\lambda+\mu})$ be the trial on which the $n$-th success occurs, where each trial succeeds with probability $\frac{\lambda}{\lambda+\mu}$. Let $S\sim B(n+m-1,\frac{\lambda}{\lambda+\mu})$ be the number of successes among the first $n+m-1$ of those same trials. We always have $T\geq n$ and $S\leq n+m-1$, and the $n$-th success occurs within the first $n+m-1$ trials exactly when those trials contain at least $n$ successes, so $$ n\leq T\leq n+m-1 \Longleftrightarrow n\leq S\leq n+m-1. $$ The formula $(*)$ is the expression for $P(n\leq T\leq n+m-1)$ and the second formula $(**)$ is $P(n\leq S\leq n+m-1)$. Since the two events coincide, we must have $$ P(n\leq T\leq n+m-1) = P(n\leq S\leq n+m-1). $$ This shows that $(*)$ and $(**)$ are equivalent.
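The claimed equivalence of $(*)$ and $(**)$ is easy to double-check numerically. A minimal sketch (the names `star` and `double_star` are mine, and the rate values are arbitrary):

```python
import math

def star(n, m, lam, mu):
    # (*): negative-binomial form, summing over the number of failures k.
    p, q = lam / (lam + mu), mu / (lam + mu)
    return sum(math.comb(k + n - 1, n - 1) * p**n * q**k for k in range(m))

def double_star(n, m, lam, mu):
    # (**): binomial tail on n+m-1 trials, summing over the number of successes k.
    p, q = lam / (lam + mu), mu / (lam + mu)
    return sum(
        math.comb(n + m - 1, k) * p**k * q ** (n + m - 1 - k)
        for k in range(n, n + m)
    )

# The two forms should agree for every (n, m).
for n in range(1, 6):
    for m in range(1, 6):
        assert math.isclose(star(n, m, 1.3, 0.7), double_star(n, m, 1.3, 0.7))
```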
A special case $n=m=1$ is easier to calculate: In this case $X \sim \mathrm{Exp}(\lambda)$ and $Y\sim \mathrm{Exp}(\mu)$. Then $$ \begin{align} P(X<Y)&=\int_0^\infty P(x<Y) \lambda e^{-\lambda x} dx\\ &=\lambda \int_0^{\infty} e^{-(\lambda+\mu)x} dx = \frac{\lambda}{\lambda+\mu}. \end{align} $$ There is also a reasoning that leads directly to the binomial and negative binomial distributions.
Let $\{A_t\}$ and $\{B_t\}$ be independent Poisson processes with rates $\lambda$ and $\mu$ respectively. Then we can regard the arrivals from $\{A_t\}$ as successes and the arrivals from $\{B_t\}$ as failures. The probability that any given arrival is a success is $\frac{\lambda}{\lambda+\mu}$, by the $n=m=1$ case above. The event that the $n$-th arrival time from $\{A_t\}$ is less than the $m$-th arrival time from $\{B_t\}$ can be described as
The $n$-th success happens before the $m$-th failure. $\ \ \ \ \rm (I)$
The formulas $(*)$ and $(**)$ both represent the probability of the event $ \rm(I)$.
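The event $\rm(I)$ can also be simulated directly as a sequence of Bernoulli trials, without sampling any Erlang variables. A sketch (the helper name `nth_success_before_mth_failure` and all parameter values are mine) that compares the empirical frequency of $\rm(I)$ with the binomial tail $(**)$:

```python
import math
import random

lam, mu, n, m = 2.0, 3.0, 2, 3
p = lam / (lam + mu)  # success probability: next arrival comes from {A_t}

def nth_success_before_mth_failure(p, n, m, rng):
    """Run Bernoulli(p) trials until either the n-th success or m-th failure."""
    successes = failures = 0
    while successes < n and failures < m:
        if rng.random() < p:
            successes += 1
        else:
            failures += 1
    return successes == n

rng = random.Random(1)
trials = 200_000
freq = sum(nth_success_before_mth_failure(p, n, m, rng) for _ in range(trials)) / trials

# (**): at least n successes in the first n+m-1 trials.
exact = sum(
    math.comb(n + m - 1, k) * p**k * (1 - p) ** (n + m - 1 - k)
    for k in range(n, n + m)
)
print(freq, exact)  # the simulated frequency should be close to the exact value
```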