Deriving distribution from conditional distribution


Hi guys, I am having problems deriving $P(X = k)$ given that $P(X = k \mid X+Y = n) = \binom{n}{k}\, 2^{-n}$.

$X$ and $Y$ are i.i.d. random variables with values in $\mathbb{N}_0$.

After playing with the formula a bit and using the independence of the variables, I get to:

$$P(X = k) = \binom{n}{k}\, 2^{-n}\, \frac{P(X+Y = n)}{P(Y = n-k)}$$

and I could theoretically rewrite $P(X+Y = n) = \sum_{k = 0}^{n} P(X = k)\, P(Y = n-k)$, although I am not sure whether the indexing is correct, and it does not really help me get further.
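The convolution indexing above is correct: for $X+Y=n$ with $X,Y\in\mathbb{N}_0$, the index $k$ runs over $0,\dots,n$. As a numerical sanity check (using a hypothetical concrete choice of distribution, Poisson, just for illustration; the identity holds for any i.i.d. $\mathbb{N}_0$-valued pair):

```python
from math import exp, factorial

# Illustrative assumption: X, Y i.i.d. Poisson(lam). The convolution
# identity P(X+Y=n) = sum_{k=0}^{n} P(X=k) P(Y=n-k) holds for any
# i.i.d. N_0-valued variables; Poisson just makes both sides computable.
lam = 1.3
p = lambda k: exp(-lam) * lam**k / factorial(k)  # P(X = k)

n = 7
# The sum of two independent Poisson(lam) variables is Poisson(2*lam).
lhs = exp(-2 * lam) * (2 * lam) ** n / factorial(n)   # P(X + Y = n) directly
rhs = sum(p(k) * p(n - k) for k in range(n + 1))      # convolution, k = 0..n
assert abs(lhs - rhs) < 1e-12
```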

I know from an exercise that at some point I should use induction, but I just do not know how. Thank you for any help.


Let $p_n=P(X=n)$. Consider $P(X=n \mid X+Y=n)=\frac{P(X=n)P(Y=0)}{P(X+Y=n)}=2^{-n},$ which together with $P(X+Y=n)=\sum_{k=0}^n p_kp_{n-k}$ implies $$ p_np_0=2^{-n}\sum_{k=0}^n p_kp_{n-k}.\tag1 $$

Using $(1)$ and induction, you can prove that for all $n\ge 0$, $$ p_n=p_0\cdot \frac{(p_1/p_0)^n}{n!}.\tag{2} $$

Summing $(2)$ over all $n\ge 0$, you get $$ \sum_n p_n=1=p_0\sum_n \frac{(p_1/p_0)^n}{n!}=p_0e^{p_1/p_0}\implies p_1=-p_0\log p_0. \tag3 $$

Therefore, combining $(3)$ and $(2)$, $$ p_n=p_0\frac{(-\log p_0)^n}{n!}. $$ So the distribution of $X$ is determined up to a free parameter, $p_0$. Note that $X$ has a Poisson$(\lambda)$ distribution with $\lambda=-\log p_0$.
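As a quick numerical check of the conclusion (not part of the proof): if $X,Y$ are i.i.d. Poisson$(\lambda)$, then $P(X=k \mid X+Y=n)$ should come out to $\binom nk 2^{-n}$ for every $k$, regardless of the rate $\lambda$.

```python
from math import comb, exp, factorial

lam = 0.7  # arbitrary rate; the conditional law should not depend on it
poisson = lambda k, mu: exp(-mu) * mu**k / factorial(k)

n = 6
for k in range(n + 1):
    # P(X=k | X+Y=n) = P(X=k) P(Y=n-k) / P(X+Y=n), with X+Y ~ Poisson(2*lam)
    cond = poisson(k, lam) * poisson(n - k, lam) / poisson(n, 2 * lam)
    assert abs(cond - comb(n, k) * 2**-n) < 1e-12
```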


In case you have trouble proving $(2)$, note that the base cases $n=0,1$ are immediate, and for all $n\ge 2$, the inductive step is as follows (hidden behind spoilers in case you want to find it yourself):

\begin{align}p_np_0&=2^{-n}\cdot 2p_0p_n+2^{-n}\sum_{k=1}^{n-1} p_kp_{n-k}\\&=2^{-n+1}p_0p_n+2^{-n}\sum_{k=1}^{n-1} p_0^2 \frac{(p_1/p_0)^k}{k!}\frac{(p_1/p_0)^{n-k}}{(n-k)!}\\&=2^{-n+1}p_0p_n+2^{-n}\frac{p_0^2(p_1/p_0)^n}{n!}\sum_{k=1}^{n-1}\binom{n}k\\&=2^{-n+1}p_0p_n+2^{-n}\frac{p_0^2(p_1/p_0)^n}{n!}\cdot (2^n-2), \end{align}which simplifies to $(2)$.
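One can also verify numerically that the closed form from $(2)$ and $(3)$, $p_n = p_0(-\log p_0)^n/n!$, really does sum to $1$ and satisfy the recursion $(1)$ for an arbitrarily chosen free parameter $p_0 \in (0,1)$:

```python
from math import factorial, log

p0 = 0.4            # arbitrary choice of the free parameter p_0 in (0, 1)
lam = -log(p0)      # the Poisson rate implied by (3)
p = [p0 * lam**n / factorial(n) for n in range(60)]  # p_n from (2)-(3)

# The probabilities sum to 1 (truncating the series at 60 terms is
# far beyond machine precision here).
assert abs(sum(p) - 1) < 1e-12

# The recursion (1): p_n * p_0 = 2^{-n} * sum_{k=0}^{n} p_k p_{n-k}.
for n in range(10):
    conv = sum(p[k] * p[n - k] for k in range(n + 1))
    assert abs(p[n] * p[0] - 2**-n * conv) < 1e-12
```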