If $Y$ is Poisson with parameter $\lambda$ and $X \vert Y$ is binomial with parameters $(y,p)$, find the distribution of $X$.


I was solving a problem where $Y$ is a random variable with a Poisson distribution with parameter $\lambda$ and $X \vert Y$ is binomial with parameters $(y,p)$. I need to find the density function, the expected value, and the variance of $X$.

Using the formulas $E(X)=E(E(X \vert Y))$ and $\operatorname{var}(X)=E(\operatorname{var}(X \vert Y))+\operatorname{var}(E(X \vert Y))$, I know that $E(X)=\operatorname{var}(X)=p \lambda$.
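As a quick sanity check of those two formulas, here is a Monte Carlo sketch (the parameter values $\lambda = 4$, $p = 0.3$ and the sample size are illustrative, not part of the problem):

```python
import math
import random

random.seed(0)
lam, p, n = 4.0, 0.3, 100_000  # illustrative parameters

def poisson(lam):
    # Knuth's algorithm: multiply uniforms until the product drops below e^{-lam}
    L = math.exp(-lam)
    k, prod = 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    return k

# Draw Y ~ Poisson(lam), then X | Y=y ~ Binomial(y, p)
xs = []
for _ in range(n):
    y = poisson(lam)
    xs.append(sum(random.random() < p for _ in range(y)))

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
print(mean, var)  # both should be close to p * lam = 1.2
```

Both the sample mean and the sample variance come out near $p\lambda$, consistent with the tower property and the law of total variance.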

But my problem is the density function. Using the formula $f_{X \vert Y}(x \vert y) = f_{X,Y}(x,y)/f_{Y}(y)$, I compute $f_{X,Y}$ and then $f_{X}$, but the density of $X$ comes out as Poisson with parameter $\lambda$, when I think it should be Poisson with parameter $\lambda p$, given what $E(X)$ and $\operatorname{var}(X)$ tell me. What am I doing wrong?


There are 3 best solutions below

BEST ANSWER

Evaluate \begin{align*} P[X = k] &= \sum_{j = 0}^\infty P[Y=j]P[X=k|Y=j]\\ &= \sum_{j = k}^\infty e^{-\lambda} \frac{\lambda^j}{j!}\binom{j}{k}p^k(1 - p)^{j - k} \\ &= \frac{e^{-\lambda}p^k \lambda^k}{k!}\sum_{j = k}^\infty \frac{\lambda^{j - k} (1 - p)^{j-k} }{(j-k)!} \\ &=\frac{e^{-\lambda} p^k \lambda^k}{k!} e^{\lambda(1 - p)} \\ &= \frac{e^{-\lambda p} (p\lambda)^k}{k!}\,. \end{align*}


As an alternative approach, let's compute the moment generating function of $X$:

$$ E(\exp(sX)) = E(E(\exp(sX) \mid Y)) = \sum_{i=0}^{\infty} E(\exp(sX) \mid Y=i)\,e^{-\lambda}\frac{\lambda^i}{i!} $$

Because $$ E(\exp(sX) \mid Y=n) = \sum_{i=0}^{n} {n\choose{i}}p^iq^{n-i}e^{si} = \sum_{i=0}^{n} {n\choose{i}}(pe^{s})^i q^{n-i} = (pe^s+q)^n, $$ the total sum becomes

$$ E(\exp(sX)) = \sum_{i=0}^{\infty}(pe^s+q)^i e^{-\lambda}\frac{\lambda^i}{i!} = \sum_{i=0}^{\infty}e^{-\lambda}\frac{(\lambda(pe^s+q))^i}{i!}. $$ This is the Taylor expansion of $e^x$ with $$ x = \lambda(pe^s+q), $$ and we conclude $$ E(\exp(sX)) = e^{-\lambda + \lambda p e^s + \lambda q} = e^{-p\lambda + \lambda p e^s} = e^{\lambda p(e^s - 1)}. $$ We recognize this as the moment generating function of a Poisson distribution with parameter $p\lambda$.

Since a moment generating function uniquely determines the distribution, this proves that $X\in Po(p\lambda)$.


$Y \in Po(\lambda)$ means that for large $N$ the distribution of $Y$ is approximately $Bin(\frac{\lambda}{N},N)$, so we can write $$ Y = \sum_{i=1}^{N}I_i, $$ with the $I_i$ i.i.d., equal to $1$ with probability $\lambda / N$ and $0$ otherwise. Now take $$ X = \sum_{i=1}^{N}J_iI_i, $$ with the $J_i$ i.i.d., equal to $1$ with probability $p$ and $0$ otherwise. Then $X \in Bin(p\lambda / N,N)$, and given $Y=y$ we have $X \in Bin(y,p)$, so all the prerequisites of the problem are satisfied. Using $Bin(p \lambda/N,N)\approx Po(p\lambda)$ and letting $N \to \infty$ indicates the answer: $X \in Po(p\lambda)$.
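The final approximation step can be illustrated numerically: the pmf of $Bin(p\lambda/N, N)$ converges to the pmf of $Po(p\lambda)$ as $N$ grows. A sketch with illustrative values $\lambda = 4$, $p = 0.5$:

```python
import math

lam, p = 4.0, 0.5  # illustrative values
mu = lam * p       # the thinned rate p * lambda

def pois_pmf(k, mu):
    return math.exp(-mu) * mu ** k / math.factorial(k)

def binom_pmf(k, n, pr):
    return math.comb(n, k) * pr ** k * (1 - pr) ** (n - k)

# Worst-case pmf gap between Bin(mu/N, N) and Po(mu) over k = 0..14,
# for increasing N
errs = {N: max(abs(binom_pmf(k, N, mu / N) - pois_pmf(k, mu)) for k in range(15))
        for N in (10, 100, 10_000)}
print(errs)  # the gap shrinks as N grows
```

The gap decreases roughly like $1/N$, which is why letting $N \to \infty$ recovers the Poisson limit.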