This is question 55 on page 86 of Ross's book (Introduction to Probability Theory).
Suppose that the joint probability mass function of $X$ and $Y$ is
$$ P(X = i, Y = j) = {j\choose i} e^{-2\lambda} \frac{\lambda^j}{j!}, \quad 0\le i\le j$$
(a) Find the probability mass function of $Y$
(b) Find the probability mass function of $X$
(c) Find the probability mass function of $Y - X$
My answers:
(a): $f_Y(j) = e^{-2\lambda} {(2\lambda)}^j/j!$
(b): $f_X(i) = e^{-\lambda} \lambda^i/i!$
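As a sanity check (not part of the book's solution), one can sum the joint pmf numerically and compare against the two Poisson pmfs above; the value $\lambda = 1.3$ below is an arbitrary choice:

```python
import math

lam = 1.3          # arbitrary test value for lambda
N = 60             # truncation point; Poisson tail terms beyond this are negligible

def joint(i, j):
    """P(X = i, Y = j) = C(j, i) e^{-2 lam} lam^j / j!, for 0 <= i <= j."""
    if not 0 <= i <= j:
        return 0.0
    return math.comb(j, i) * math.exp(-2 * lam) * lam**j / math.factorial(j)

def poisson(mu, k):
    """Poisson(mu) pmf at k."""
    return math.exp(-mu) * mu**k / math.factorial(k)

# marginal of Y: summing over i = 0..j should give Poisson(2*lam),
# since sum_i C(j, i) = 2^j turns lam^j into (2*lam)^j
for j in range(10):
    pY = sum(joint(i, j) for i in range(j + 1))
    assert abs(pY - poisson(2 * lam, j)) < 1e-12

# marginal of X: summing over j = i..infinity should give Poisson(lam)
for i in range(10):
    pX = sum(joint(i, j) for j in range(i, N))
    assert abs(pX - poisson(lam, i)) < 1e-12

print("marginals match Poisson(2*lam) for Y and Poisson(lam) for X")
```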
for (c): to calculate $P\{Y - X = n\}$,
suppose $X = k$, $Y = n + k$;
then
\begin{align} P\{Y - X = n\} & = \sum_{k=0}^n P\{X = k, Y = n + k\} = \sum_{k=0}^n P(X = k)\, P(Y = n + k) \\[10pt] & = \sum_{k=0}^n e^{-\lambda} \frac{\lambda^k}{k!} \cdot e^{-2\lambda} \frac{(2\lambda)^{n+k}}{(n+k)!} \end{align}
then I was stuck here.
Thanks in advance
Update 1: following the suggestions by @Michael Hardy
- $P(Y-X \mid Y) = P(X \mid Y)$: calculate this first (why???)
Intuitively, $Y - X$ and $X$ complement each other when $Y$ is given. Dividing the joint pmf by the marginal of $Y$:
$$ P(X = i \mid Y = j) = \frac{P(X = i, Y = j)}{P(Y = j)} = \frac{{j\choose i} e^{-2\lambda} \lambda^j/j!}{e^{-2\lambda} (2\lambda)^j/j!} = {j\choose i} \frac{1}{2^j}, $$
so $X \mid Y = j \sim \text{Binomial}(j, 1/2)$. Likewise, since $\{Y - X = j - i,\, Y = j\} = \{X = i,\, Y = j\}$,
$$ P(Y - X = j - i \mid Y = j) = P(X = i \mid Y = j) = {j\choose i}\frac{1}{2^j} = {j\choose j-i}\frac{1}{2^j}, $$
so $Y - X \mid Y = j$ is also $\text{Binomial}(j, 1/2)$: given $Y$, the conditional distributions of $X$ and $Y - X$ coincide.
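Dividing the joint pmf $P(X=i, Y=j)$ by the marginal $f_Y(j) = e^{-2\lambda}(2\lambda)^j/j!$ gives $P(X = i \mid Y = j) = \binom{j}{i}(1/2)^j$, a $\text{Binomial}(j, 1/2)$ pmf, and by symmetry the same holds for $Y - X$ given $Y = j$. A quick numerical check of both facts ($\lambda = 1.3$ is an arbitrary test value; the conditional pmf does not actually depend on $\lambda$):

```python
import math

lam = 1.3  # arbitrary test value; cancels out of the conditional pmf

def joint(i, j):
    """P(X = i, Y = j) from the problem statement."""
    return math.comb(j, i) * math.exp(-2 * lam) * lam**j / math.factorial(j)

def marginal_Y(j):
    """f_Y(j) = e^{-2 lam} (2 lam)^j / j!  (Poisson with mean 2*lam)."""
    return math.exp(-2 * lam) * (2 * lam)**j / math.factorial(j)

for j in range(1, 12):
    for i in range(j + 1):
        cond = joint(i, j) / marginal_Y(j)
        binom = math.comb(j, i) * 0.5**j          # Binomial(j, 1/2) pmf at i
        assert abs(cond - binom) < 1e-12
        # symmetry: P(Y - X = j - i | Y = j) = P(X = i | Y = j)
        assert abs(cond - math.comb(j, j - i) * 0.5**j) < 1e-12

print("X | Y = j  ~  Binomial(j, 1/2), and the same for Y - X")
```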
- Calculate P(Y-X).
suppose $Y - X = n$ and $X = i$; then
$$ P(Y-X = n) = \sum_{i=0}^\infty {i+n\choose i} e^{-2\lambda} \frac{\lambda^{i+n}}{(i+n)!} = \sum_{i=0}^\infty e^{-2\lambda} \frac{\lambda^{i+n}}{i!\,n!} = e^{-2\lambda}\frac{\lambda^{n}}{n!} \sum_{i=0}^\infty \frac{\lambda^i}{i!} = e^{-2\lambda}\frac{\lambda^{n}}{n!}\, e^{\lambda} = e^{-\lambda}\frac{\lambda^{n}}{n!} $$
This is the same as @Mohit's result.
Careful, $X, Y$ aren't necessarily independent. Your idea was right though:
$$ P(Y-X = n) = \sum_{k=0}^\infty P(X = k, Y = n + k) = \sum_{k=0}^\infty {n+k\choose k} e^{-2\lambda} \frac{\lambda^{n+k}}{(n+k)!} \\ = e^{-2\lambda}\frac{\lambda^{n}}{n!} \sum_{k=0}^\infty \frac{\lambda^{k}}{k!} = e^{-\lambda}\frac{\lambda^{n}}{n!} $$
But this is just the Poisson distribution with parameter $\lambda$.
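This can be confirmed numerically by summing the joint pmf along the diagonals $j - i = n$ and comparing with the Poisson($\lambda$) pmf ($\lambda = 1.3$ below is an arbitrary test value):

```python
import math

lam = 1.3   # arbitrary test value for lambda
N = 80      # truncation of the infinite sum over k; the tail is negligible

def joint(i, j):
    """P(X = i, Y = j) = C(j, i) e^{-2 lam} lam^j / j!"""
    return math.comb(j, i) * math.exp(-2 * lam) * lam**j / math.factorial(j)

for n in range(10):
    # P(Y - X = n) = sum over k of P(X = k, Y = n + k)
    p = sum(joint(k, n + k) for k in range(N))
    target = math.exp(-lam) * lam**n / math.factorial(n)   # Poisson(lam) pmf at n
    assert abs(p - target) < 1e-12

print("Y - X ~ Poisson(lam)")
```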