probability mass function of $Y-X$ (two Poisson random variables)


It is question 55 on page 86 of Ross's book (Introduction to Probability Theory).

Suppose that the joint probability mass function of $X$ and $Y$ is

$$ P(X = i, Y = j) = {j\choose i} e^{-2\lambda} \frac{\lambda^j}{j!}, \quad 0\le i\le j$$

(a) Find the probability mass function of $Y$

(b) Find the probability mass function of $X$

(c) Find the probability mass function of $Y - X$

My answers:

(a): $f_Y(j) = e^{-2\lambda} {(2\lambda)}^j/j!$

(b): $f_X(i) = e^{-\lambda} \lambda^i/i!$

For (c), to calculate $P\{Y - X = n\}$,

suppose $X = k$ and $Y = n + k$;

then

\begin{align} P\{Y - X = n\} & = \sum_{k=0}^\infty P\{X = k, Y = n + k\} = \sum_{k=0}^\infty P(X = k)\, P(Y = n + k) \\[10pt] & = \sum_{k=0}^\infty e^{-\lambda} \frac{\lambda^k}{k!} \cdot e^{-2\lambda} \frac{(2\lambda)^{n+k}}{(n+k)!} \end{align}

then I was stuck here.

Thanks in advance

Update 1: following the suggestions by @Michael Hardy.

  1. Show that $P(Y-X \mid Y) = P(X \mid Y)$ first. (Why? See the intuition below.)

Intuitively, $Y - X$ and $X$ complement each other when $Y$ is given. Dividing the joint pmf by the marginal of $Y$ from part (a):
$$ P(X = i \mid Y = j) = \frac{P(X = i, Y = j)}{P(Y = j)} = \frac{{j\choose i} e^{-2\lambda} \lambda^j/j!}{e^{-2\lambda} (2\lambda)^j/j!} = {j\choose i} \left(\frac{1}{2}\right)^{j}, $$
so $X \mid Y = j \sim \operatorname{Binomial}(j, \tfrac12)$. Likewise,
$$ P(Y - X = m \mid Y = j) = P(X = j - m \mid Y = j) = {j\choose j-m} \left(\frac{1}{2}\right)^{j} = {j\choose m} \left(\frac{1}{2}\right)^{j} = P(X = m \mid Y = j), $$
so the two conditional distributions coincide.

  2. Calculate $P(Y-X)$.

Suppose $Y - X = n$ and $X = i$; then

\begin{align} P(Y-X = n) & = \sum_{i=0}^\infty {i+n\choose i} e^{-2\lambda} \frac{\lambda^{i+n}}{(i+n)!} \\[10pt] & = e^{-2\lambda}\frac{\lambda^{n}}{n!} \sum_{i=0}^\infty \frac{\lambda^i}{i!} \\[10pt] & = e^{-2\lambda}\frac{\lambda^{n}}{n!} \, e^{\lambda} \\[10pt] & = e^{-\lambda}\frac{\lambda^{n}}{n!}. \end{align}

It is the same as @Mohit's result.
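This result can also be sanity-checked numerically by summing the joint pmf along the diagonal $Y = X + n$ and comparing with the $\operatorname{Poisson}(\lambda)$ pmf. The sketch below uses an arbitrary, hypothetical value of $\lambda$ and truncates the infinite sum over $i$:

```python
from math import comb, exp, factorial

lam = 1.3  # arbitrary hypothetical value for lambda

def joint(i, j):
    """Joint pmf P(X = i, Y = j) from the problem statement, 0 <= i <= j."""
    return comb(j, i) * exp(-2 * lam) * lam**j / factorial(j)

def poisson(n, mu):
    """Poisson(mu) pmf at n."""
    return exp(-mu) * mu**n / factorial(n)

# P(Y - X = n) = sum over i of P(X = i, Y = i + n); the tail beyond
# i = 60 is negligibly small for this lambda.
for n in range(8):
    total = sum(joint(i, i + n) for i in range(60))
    assert abs(total - poisson(n, lam)) < 1e-12
```

Each probability $P(Y - X = n)$ indeed matches the $\operatorname{Poisson}(\lambda)$ pmf to machine precision.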


There are 2 best solutions below

Solution 1 (1 vote)

Careful, $X, Y$ aren't necessarily independent. Your idea was right though:

$$ P(Y-X = n) = \sum_{k=0}^\infty P(X = k, Y = n + k) = \sum_{k=0}^\infty {n+k\choose k} e^{-2\lambda} \lambda^{n+k}/(n+k)! \\ = e^{-2\lambda}\lambda^{n}/n! \sum_{k=0}^\infty \lambda^{k}/k! = e^{-\lambda}\lambda^{n}/n! $$

But this is just the Poisson distribution with parameter $\lambda$.

Solution 2 (4 votes)

$$\Pr(X = i, Y = j) = {j\choose i} e^{-2\lambda} \frac{\lambda^j}{j!}, \quad 0\le i\le j$$

\begin{align} \Pr(Y=j) = {} & \sum_{i=0}^j \Pr(X=i\ \&\ Y=j) \\[10pt] = {} & \sum_{i=0}^j \binom j i e^{-2\lambda} \frac{\lambda^j}{j!}. \\ & \text{In this sum, everything to the right of $\dbinom j i$ does} \\ & \text{not change as $i$ goes from $0$ to $j$. Therefore it can} \\ & \text{be pulled out:} \\[10pt] = {} & e^{-2\lambda} \frac{\lambda^j}{j!} \sum_{i=0}^j \binom j i \\[10pt] = {} & e^{-2\lambda} \frac{\lambda^j}{j!} \cdot 2^j \text{ by the binomial theorem} \\[10pt] = {} & e^{-2\lambda} \frac{(2\lambda)^j}{j!}. \\[15pt] \text{Therefore } Y \sim {} & \operatorname{Poisson}(2\lambda). \end{align}
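The marginalization for $Y$ can be verified numerically; this is a sketch with an arbitrary, hypothetical value of $\lambda$:

```python
from math import comb, exp, factorial

lam = 0.9  # arbitrary hypothetical value for lambda

def joint(i, j):
    """Joint pmf P(X = i, Y = j) from the problem statement, 0 <= i <= j."""
    return comb(j, i) * exp(-2 * lam) * lam**j / factorial(j)

# Sum the joint pmf over i = 0..j and compare with the Poisson(2*lam) pmf.
for j in range(10):
    marginal = sum(joint(i, j) for i in range(j + 1))
    target = exp(-2 * lam) * (2 * lam)**j / factorial(j)
    assert abs(marginal - target) < 1e-12
```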

\begin{align} \Pr(X=i) = {} & \sum_{j=i}^\infty \binom j i e^{-2\lambda} \frac{\lambda^j}{j!} \\ & \text{Notice that this sum starts with $j=i,$ not with} \\ & \text{$j=0$ or $j={}$something else, since we're told at} \\ & \text{the outset that $i\le j$.} \\[10pt] & = e^{-2\lambda} \sum_{j=i}^\infty \frac{\lambda^{j-i}}{(j-i)!} \cdot\frac{\lambda^i}{i!} \\ & \ldots\text{ and now $\lambda^i/(i!)$ does not change as $j$ goes from} \\& \text{$i$ to $\infty,$ so it can be pulled out:} \\[10pt] = {} & e^{-2\lambda} \frac{\lambda^i}{i!} \sum_{j=i}^\infty \frac{\lambda^{j-i}}{(j-i)!} \\[10pt] = {} & e^{-2\lambda} \frac{\lambda^i}{i!} \sum_{k=0}^\infty \frac{\lambda^k}{k!} \text{ where } k = j-i \\[10pt] = {} & e^{-2\lambda} \frac{\lambda^i}{i!} \cdot e^{\lambda} \\[10pt] = {} & e^{-\lambda} \frac{\lambda^i}{i!}. \\[15pt] \text{Therefore } X \sim {} & \operatorname{Poisson}(\lambda). \end{align}
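The same kind of check works for the marginal of $X$, this time truncating the infinite sum over $j$ (again with an arbitrary, hypothetical $\lambda$):

```python
from math import comb, exp, factorial

lam = 1.1  # arbitrary hypothetical value for lambda

def joint(i, j):
    """Joint pmf P(X = i, Y = j) from the problem statement, 0 <= i <= j."""
    return comb(j, i) * exp(-2 * lam) * lam**j / factorial(j)

# Sum the joint pmf over j = i..infinity (truncated at j = 60, where the
# tail is negligible) and compare with the Poisson(lam) pmf.
for i in range(8):
    marginal = sum(joint(i, j) for j in range(i, 60))
    target = exp(-lam) * lam**i / factorial(i)
    assert abs(marginal - target) < 1e-12
```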

To show that $Y-X\sim\operatorname{Poisson}(\lambda),$ first show that the conditional distribution of $Y-X$ given $Y$ is the same as the conditional distribution of $X$ given $Y.$ Then that conclusion follows.
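That suggested first step can be checked numerically: conditional on $Y = j$, both $X$ and $Y - X$ have the $\operatorname{Binomial}(j, \tfrac12)$ pmf. This is a sketch with arbitrary, hypothetical values for $\lambda$ and $j$:

```python
from math import comb, exp, factorial

lam = 1.7  # arbitrary hypothetical value for lambda

def joint(i, j):
    """Joint pmf P(X = i, Y = j) from the problem statement, 0 <= i <= j."""
    return comb(j, i) * exp(-2 * lam) * lam**j / factorial(j)

def p_y(j):
    """Marginal pmf of Y, which part (a) shows is Poisson(2*lam)."""
    return exp(-2 * lam) * (2 * lam)**j / factorial(j)

j = 6  # arbitrary hypothetical value for the conditioning event Y = j
for i in range(j + 1):
    cond_x = joint(i, j) / p_y(j)      # P(X = i | Y = j)
    cond_d = joint(j - i, j) / p_y(j)  # P(Y - X = i | Y = j) = P(X = j - i | Y = j)
    binom = comb(j, i) * 0.5**j        # Binomial(j, 1/2) pmf at i
    assert abs(cond_x - binom) < 1e-12
    assert abs(cond_d - binom) < 1e-12
```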

It seems regrettable that the exercise didn't have a part $(d)$ in which you show that $X$ and $Y-X$ are actually independent.
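For what it's worth, that independence is easy to verify directly: $P(X = i,\, Y - X = m) = P(X = i,\, Y = i + m)$ factors exactly into the product of two $\operatorname{Poisson}(\lambda)$ pmfs. A sketch with an arbitrary, hypothetical $\lambda$:

```python
from math import comb, exp, factorial

lam = 1.5  # arbitrary hypothetical value for lambda

def joint(i, j):
    """Joint pmf P(X = i, Y = j) from the problem statement, 0 <= i <= j."""
    return comb(j, i) * exp(-2 * lam) * lam**j / factorial(j)

def poisson(n, mu):
    """Poisson(mu) pmf at n."""
    return exp(-mu) * mu**n / factorial(n)

# P(X = i, Y - X = m) = P(X = i, Y = i + m) should equal
# P(X = i) * P(Y - X = m) = Poisson(lam) pmf at i times Poisson(lam) pmf at m.
for i in range(6):
    for m in range(6):
        assert abs(joint(i, i + m) - poisson(i, lam) * poisson(m, lam)) < 1e-12
```

Algebraically, $\binom{i+m}{i} e^{-2\lambda} \frac{\lambda^{i+m}}{(i+m)!} = \frac{e^{-2\lambda}\lambda^{i+m}}{i!\,m!} = \left(e^{-\lambda}\frac{\lambda^i}{i!}\right)\left(e^{-\lambda}\frac{\lambda^m}{m!}\right)$, which is the factorization the check confirms.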