Proving that the number of heads and the number of tails are independent random variables.

  1. Let $X$ be a random variable that follows a Poisson distribution of parameter $m$, and let $Y$ be a random variable whose conditional law given $X = n$ is a binomial distribution with parameters $n, p$.

Prove that: $$ p(Y = k) = \frac{(pm)^k e^{-mp}}{k!} $$

and deduce the nature of $Y$.

During a class, a bored student watches from the window the leaves falling from a tree. We assume that the number of leaves fallen from the tree by the end of the class is a random variable $X$ that follows a Poisson distribution of parameter $\lambda$:

$$p(X = k) = \frac{\lambda^k}{k!}e^{-\lambda}$$

  1. From the hypotheses above, why can we conclude that $e^\lambda = \sum^{+ \infty}_{k = 0} \frac{\lambda^k}{k!}$?

  2. Compute $E(X)$ and $V(X)$.

  3. Every time a leaf falls on the ground, the student flips a coin which gives tails with probability $p$ and heads with probability $q = 1 - p$, where $p \in (0,1)$.

We denote by $T$ and $H$ the number of tails and the number of heads obtained, respectively.

3.1. For $k$ fixed, explain why the distribution of $H$ knowing $X = k$ is a binomial distribution and deduce the expression of $p(H = a \mid X = k)$.

3.2. For $(a,k) \in \mathbb{N}^2$, compute $p(X = k, H = a)$.

3.3. Deduce the law of $H$ and its expected value.

3.4. Without computation, determine the law of $T$.

3.5. Prove that $H$ and $T$ are independent.


EDIT

Following David K's help, I decided to edit the post and write up the answers to the questions I was stuck on.

3.2. Computing $p(H = a, X = k)$ :

We know that:

$ p(H = a| X = k) = \frac{p(H = a, X = k)}{p(X = k)} $

$X$ follows a Poisson distribution of parameter $\lambda$, and we also know that the number of heads in $k$ flips is a binomial random variable with parameters $k$ and $q$, so we have:

$$ \begin{align} p(H = a, X = k) & = p(X = k). p(H = a| X = k) \\ & = \frac{\lambda^k}{k!}e^{-\lambda}\, C^{a}_k\, q^a p^{k - a} \end{align} $$
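As a quick sanity check (not part of the required solution), this joint formula can be verified numerically: summing it over $a = 0, \dots, k$ must recover the Poisson marginal $p(X = k)$. The values of $\lambda$ and $p$ below are arbitrary illustrative choices.

```python
from math import comb, exp, factorial, isclose

lam, p = 2.5, 0.3   # illustrative parameter choices
q = 1 - p

def joint(a, k):
    """p(H = a, X = k): Poisson factor times the binomial conditional."""
    return exp(-lam) * lam**k / factorial(k) * comb(k, a) * q**a * p**(k - a)

def poisson(k, mean):
    return exp(-mean) * mean**k / factorial(k)

# Summing the joint pmf over a = 0..k must give back p(X = k).
for k in range(10):
    assert isclose(sum(joint(a, k) for a in range(k + 1)), poisson(k, lam))
```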

3.3. Applying the result of question 1, we get: $$p(H = a) = \frac{(q\lambda)^{a}}{a!}e^{-q\lambda} $$

$H$ follows a Poisson distribution of parameter $(q\lambda)$, its expected value is $(q\lambda)$.
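The marginalization over $k$ behind this result can also be confirmed numerically with a truncated sum; again, $\lambda$ and $p$ are illustrative values, not part of the original problem.

```python
from math import comb, exp, factorial, isclose

lam, p = 2.5, 0.3   # illustrative parameter choices
q = 1 - p

def joint(a, k):
    """p(H = a, X = k) from question 3.2."""
    return exp(-lam) * lam**k / factorial(k) * comb(k, a) * q**a * p**(k - a)

def poisson(k, mean):
    return exp(-mean) * mean**k / factorial(k)

# The truncated sum over k >= a should match the claimed Poisson(q*lam) law.
for a in range(6):
    marginal = sum(joint(a, k) for k in range(a, a + 60))
    assert isclose(marginal, poisson(a, q * lam), rel_tol=1e-9)
```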

3.4. Conditionally on $X = k$, $T$ is the number of tails in $k$ flips, so it also follows a binomial distribution, this time with success probability $p$. Applying the result of question 1 again, we get:
$$ p(T = b) = \frac{(p\lambda)^{b}}{b!}e^{-p\lambda} $$
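The same truncated-sum check works for $T$, with the roles of $p$ and $q$ swapped; the parameter values below are illustrative.

```python
from math import comb, exp, factorial, isclose

lam, p = 2.5, 0.3   # illustrative parameter choices
q = 1 - p

def poisson(k, mean):
    return exp(-mean) * mean**k / factorial(k)

# p(T = b) = sum over k >= b of C(k, b) p^b q^(k-b) p(X = k),
# which should match the claimed Poisson(p*lam) law.
for b in range(6):
    marginal = sum(comb(k, b) * p**b * q**(k - b) * poisson(k, lam)
                   for k in range(b, b + 60))
    assert isclose(marginal, poisson(b, p * lam), rel_tol=1e-9)
```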

3.5. $H$ and $T$ are independent:

$$ \begin{align} p(H = a, T = b) & = p(H = a, X = a + b) \\ & = p(H = a| X = a + b).p(X = a + b) \\ & = C^{a}_{a+b} q^a p^{b}. \frac{\lambda^{a+b}}{(a+b)!}e^{-\lambda} \\ & = \frac{(a + b)!}{a! b!} q^a p^b \frac{\lambda^a \lambda^b}{(a+b)!} e^{-\lambda( q + p )} \\ & = \frac{(q\lambda)^a}{a!} e^{- q\lambda} . \frac{(p\lambda)^b}{b!} e^{- p\lambda} \\ & = p(H = a).p(T = b) \end{align} $$
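The factorization can be double-checked numerically term by term; $\lambda$ and $p$ below are illustrative choices.

```python
from math import comb, exp, factorial, isclose

lam, p = 2.0, 0.4   # illustrative parameter choices
q = 1 - p

def poisson(k, mean):
    return exp(-mean) * mean**k / factorial(k)

# p(H = a, T = b), computed via p(H = a | X = a+b) p(X = a+b), must equal
# the product of the two Poisson marginals.
for a in range(5):
    for b in range(5):
        lhs = comb(a + b, a) * q**a * p**b * poisson(a + b, lam)
        assert isclose(lhs, poisson(a, q * lam) * poisson(b, p * lam))
```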


There are 2 best solutions below

On BEST ANSWER

For 3.2, you are asked to find $P(X=k, H=a),$ not $P(X=a,H=a).$ I think if you account for the $k$ and $a$ more carefully, you will get a different answer than the one shown at the time I write this.

For 3.3, it is true that you can apply the result of problem 1, but for $X$ the parameter of the Poisson distribution is $\lambda$ instead of $m$ (it looks like you understood, or guessed, that part correctly), and when $X=n$ then $H$ has a binomial distribution with parameters $n,p$, the same statement that is made about $Y$ in problem 1. So you are correct to substitute $\lambda$ for $m,$ but you should not substitute anything for $p$ (or, to be pedantic, you should substitute the variable named $p$ from problem 3 for the variable named $p$ in problem 1, but the effect on the notation is the same).

Also for 3.3, you can either write $P(H=k)$ so you are using the variable $k$ in the exact same way as in problem 1, or you can write $P(H=a)$ and substitute $a$ where $k$ was used in problem 1. Or you can use some other variable instead of either $k$ or $a$, as long as you don't write $P(H=\lambda)$ or $P(H=p)$ (since in those cases you would be using one name for two different things).

For 3.4, think! Is it possible that $H$ and $T$ are the same, and that $H=a$ and $T=a$? Is it possible that neither $H$ nor $T$ is equal to $a$, for example $H=a-1$ and $T=a+1$? Then by what logic can you possibly say that $P(T=a) = 1 - P(H=a)$?

For 3.5, you have the wrong notation. $P(H.T)$ doesn't mean anything. What you want to show is $$ P(H = a, T = b) = P(H=a) P(T=b). $$

The event $H=a, T=b$ is exactly the same event as $X=a+b, H=a.$ So if you have found the correct formulas for $P(X=k,H=a),$ for $P(H=a),$ and for $P(T=b),$ this part of the question almost answers itself.


If $X\sim\mathrm{Pois}(m)$ and $Y$ conditioned on $\{X=n\}$ has $\mathrm{Bin}(n,p)$ distribution then for any nonnegative integer $k$, \begin{align} \mathbb P(Y=k) &= \sum_{n=k}^\infty \mathbb P(Y=k\mid X=n)\mathbb P(X=n)\\ &= \sum_{n=k}^\infty \binom nk p^k(1-p)^{n-k} e^{-m}\frac{m^n}{n!}\\ &= \left(\frac p{1-p}\right)^k \frac{e^{-m}}{k!}\sum_{n=k}^\infty\frac{(m(1-p))^n}{(n-k)!}\\ &= \left(\frac p{1-p}\right)^k \frac{e^{-m}}{k!}\sum_{n=0}^\infty\frac{(m(1-p))^{n+k}}{n!}\\ &= (mp)^k\frac{e^{-m}}{k!}\sum_{n=0}^\infty \frac{(m(1-p))^n}{n!}\\ &=(mp)^k\frac{e^{-m}}{k!} e^{m(1-p)}\\ &= e^{-mp}\frac{(mp)^k}{k!}, \end{align} so that $Y\sim\mathrm{Pois}(mp)$.
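A quick numerical check of this conclusion, with illustrative values of $m$ and $p$ (any choices work):

```python
from math import comb, exp, factorial, isclose

m, p = 3.0, 0.25   # illustrative parameter choices

def poisson(k, mean):
    return exp(-mean) * mean**k / factorial(k)

# The truncated mixture sum_{n >= k} C(n,k) p^k (1-p)^(n-k) P(X = n)
# should match the Poisson(mp) pmf.
for k in range(8):
    mixture = sum(comb(n, k) * p**k * (1 - p)**(n - k) * poisson(n, m)
                  for n in range(k, k + 80))
    assert isclose(mixture, poisson(k, m * p), rel_tol=1e-9)
```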

For the second question, if $\mathbb P(X=k)=\frac{\lambda^k}{k!}e^{-\lambda}$ is a probability distribution over the nonnegative integers, then we must have $\sum_{k=0}^\infty \mathbb P(X=k)=1$, from which $\sum_{k=0}^\infty \frac{\lambda^k}{k!}=e^\lambda$. The expectation of $X$ is $$ \mathbb E[X] = \sum_{k=1}^\infty k \frac{\lambda^k}{k!}e^{-\lambda} = \sum_{k=1}^\infty\frac{\lambda^k}{(k-1)!}e^{-\lambda} = \lambda\sum_{k=0}^\infty \frac{\lambda^k}{k!}e^{-\lambda} = \lambda, $$ and the expectation of $X(X-1)$ is $$ \mathbb E[X(X-1)] = \sum_{k=2}^\infty k(k-1)\frac{\lambda^k}{k!}e^{-\lambda} = \lambda^2\sum_{k=0}^\infty \frac{\lambda^k}{k!}e^{-\lambda} = \lambda^2. $$ Hence the variance of $X$ is $$ \mathrm{Var}(X) = \mathbb E[X^2] - \mathbb E[X]^2 = \mathbb E[X(X-1)] + \mathbb E[X] - \mathbb E[X]^2 =\lambda^2+\lambda-\lambda^2 = \lambda. $$
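Both identities are easy to confirm numerically with a truncated sum over the pmf; the value of $\lambda$ below is an arbitrary illustrative choice.

```python
from math import exp, factorial, isclose

lam = 1.7   # illustrative parameter choice

# Truncate the pmf far into the tail; the remaining mass is negligible.
pmf = [exp(-lam) * lam**k / factorial(k) for k in range(80)]

mean = sum(k * pk for k, pk in enumerate(pmf))
second_moment = sum(k * k * pk for k, pk in enumerate(pmf))
var = second_moment - mean**2

assert isclose(mean, lam, rel_tol=1e-9)   # E[X] = lambda
assert isclose(var, lam, rel_tol=1e-9)    # Var(X) = lambda
```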

For 3), it is not clear to me what the relation between $X$ and $H,T$ is. Without that information, I cannot answer the question.