Bayesian Statistics


Question: Given $N$, $X$ is distributed as $\mathrm{B}(N,\theta)$. Derive the unconditional distribution of $X$, assuming $N$ is distributed as $P(\lambda)$ (Poisson).

This is what I have tried so far:

$$X\mid N \sim \mathrm{B}(N,\theta) \text{ so }\Pr(X=x\mid N) = \binom{N}{x} \theta^x \,(1-\theta)^{N-x}$$

and also $$\Pr(N=n) = \exp(-\lambda)\dfrac{\lambda^n}{n!},$$

so the joint distribution is $\Pr(X=x,N=n)= \binom{n}{x} \theta^x \,(1-\theta)^{n-x} \cdot \exp(-\lambda)\dfrac{\lambda^n}{n!}$ (for $n \geq x$),

then to find $\Pr(X=x) = \sum_{n=0}^{+\infty}\Pr(X=x,N=n).$

But how on earth do you sum this series when there are so many unknowns? And is there a trick to recognize the distribution?

My professor has provided the answer as $X \sim \mathrm{Poisson}(\lambda\theta)$, so I think we have to recognize the sum as the pmf of a Poisson with parameter $\lambda\theta$. But I am not sure how to go about doing it. Can anyone help?
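Before doing the algebra, the claimed answer can be sanity-checked by simulation: draw $N$ from a Poisson, then $X$ given $N$ from a binomial, and compare the moments of $X$ with those of $\mathrm{Poisson}(\lambda\theta)$. A quick sketch assuming NumPy is available (the parameter values 4.0 and 0.3 are arbitrary choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, theta, n_sims = 4.0, 0.3, 200_000   # arbitrary illustrative values

N = rng.poisson(lam, size=n_sims)        # N ~ Poisson(lambda)
X = rng.binomial(N, theta)               # X | N ~ Binomial(N, theta)

# If X ~ Poisson(lambda * theta), its mean and variance should both be
# close to lambda * theta (a Poisson has mean == variance).
print(X.mean(), X.var())   # both should be near 1.2
```

Matching first and second moments is of course not a proof, but seeing mean ≈ variance ≈ $\lambda\theta$ is strong evidence that the marginal of $X$ is the Poisson the professor claims.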

There are 3 solutions below.

Solution 1:

This is a standard property of the Poisson process. You may want to look it up in these notes as example 3.5.3.

Solution 2:

Please let's not hope for "easy tricks". They play their part in math, but we must not be afraid of doing computations.

  1. First, notice that you only need to care about the terms that contain $n$ in this sum. Everything that does not depend on $n$ is "junk" you should pull outside the summation sign (it multiplies the sum).
  2. Also simplify everything that involves $n$. For example, write $\binom{n}{x}=\dfrac{n!}{(n-x)!\,x!}$ and simplify the expression. There are several other simplifications you can do.
  3. You know the value of the expectation of a Poisson: $\sum_{k=0}^{+\infty} k\cdot \frac{\exp(-\lambda)\,\lambda^k}{k!}=\lambda$, and also that probabilities must add up to one: $\sum_{k=0}^{+\infty} \frac{\exp(-\lambda)\,\lambda^k}{k!}=1$.
Solution 3:

$$\Pr(X=x) = \sum_{n=0}^{+\infty}\Pr(X=x,N=n)=\sum_{n\geqslant x} {n\choose x} \theta^x \,(1-\theta)^{n-x} \cdot \mathrm e^{-\lambda}\dfrac{\lambda^n}{n!}$$ $$\Pr(X=x)= \frac{\theta^x}{x!} \lambda^x\mathrm e^{-\lambda}\sum_{n\geqslant x}\frac{(1-\theta)^{n-x}\lambda^{n-x}}{(n-x)!}= \frac{(\lambda\theta)^x}{x!}\mathrm e^{-\lambda}\mathrm e^{\lambda(1-\theta)}=\frac{(\lambda\theta)^x}{x!}\mathrm e^{-\lambda\theta},$$ so $X\sim\mathrm{Poisson}(\lambda\theta)$.