Poisson distribution of tails on a coin.


A random variable, X, is defined as follows. First flip a coin with probability p of heads. If the coin lands heads, then define X as X=0. If the coin lands tails, then X has a Poisson distribution, Poi(λ).

  1. Find the PMF of X.

For this problem I had that if the first flip was heads, then P(X=0)=p, since I interpreted X as the number of tails. But I'm not sure about that. If the first flip was tails, then I said it would follow the Poisson PMF: $e^{-\lambda}\lambda^k/k!$

but I wasn't sure if there should be two separate PMFs or there should only be one.

  2. Let Y~Poi(λ) and let U~Ber(p) be independent of Y. Let Z = (1-U)Y. Show that Z and X have the same PMF.

I'm a bit lost on this problem. I know that U~Ber(p) takes the values 1 and 0 with probabilities $p$ and $1-p$, and that Y~Poi(λ) has PMF $e^{-\lambda}\lambda^k/k!$. I'm not sure how to proceed.

  3. We know that E(Y) = Var(Y) = λ. Using this and part 2, find E(X) and Var(X). You can also use the fact that if V and W are independent random variables, then E(VW) = E(V)E(W).

Again, not sure about part 2, so I'm also not sure about this problem.

> but I wasn't sure if there should be two separate PMFs or there should only be one.

There should only be one, and you should combine the disjoint cases. Your instinct here is correct.

$$\mathsf P(X=k)=\begin{cases}\mathsf P(\text{heads or, tails and Poisson count of }0)&:& k=0\\\mathsf P(\text{tails and Poisson count of }k)&:& k\in\Bbb N~, k>0\\0&:&\text{otherwise}\end{cases}$$
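The piecewise formula above can be sanity-checked with a small Python sketch (the function name `pmf_X` and the test values $p=0.3$, $\lambda=2$ are my own choices, not from the problem):

```python
import math

def pmf_X(k, p, lam):
    """PMF of X: heads (probability p) forces X = 0;
    tails mixes in a Poisson(lam) count."""
    if k < 0:
        return 0.0
    pois_k = math.exp(-lam) * lam**k / math.factorial(k)
    if k == 0:
        return p + (1 - p) * pois_k  # heads, or tails with Poisson count 0
    return (1 - p) * pois_k          # tails with Poisson count k

# The probabilities over k = 0, 1, 2, ... should sum to 1.
total = sum(pmf_X(k, 0.3, 2.0) for k in range(100))
```

Summing over a long enough range of $k$ recovering 1 is a quick check that the two cases really do partition the probability.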

Part 1 and Part 2 are really just asking the same problem, with part 2 being a little more formal about it.

> I know that U~Ber(p) = p and 1-p, and that Y~Poi(λ) = $e^{-\lambda}\lambda^k/k!$. I'm not sure where to proceed.

Not quite. More specifically, what the distributions tell you are the probability mass functions for the variables, which may be combined to derive the PMF of $Z$.

Let $U\sim\mathcal{Bern}(p)$ be the indicator random variable for heads.$$\mathsf P(U=\theta)=p^\theta~(1-p)^{1-\theta}\quad\big[\theta\in\{0,1\}\big]$$

Let $Y\sim\mathcal{Pois}(\lambda)$ be the Poisson random variable.

$$\mathsf P(Y=k)=\dfrac{\lambda^k\mathrm e^{-\lambda}}{k!}\quad\big[k\in\Bbb N\big]$$

Then we have that $Z= (1-U)~Y$ , and so $Z=0$ when the coin shows heads, or else when it shows tails and the Poisson count is zero. Likewise, $Z$ is nonzero only when the coin shows tails and the Poisson count is nonzero too. Additionally, $U$ and $Y$ are independent.

This matches the description of $X$ exactly, so $X$ and $Z$ are identically distributed.

$$\begin{align}\mathsf P(X\,{=}\,k)&=\mathsf P(Z\,{=}\,k)\\[2ex]&=\begin{cases}\mathsf P(U\,{=}\,1\cup(U\,{=}\,0\cap Y\,{=}\,0))&:& k=0\\[1ex]\mathsf P(U\,{=}\,0\cap Y\,{=}\,k)&:&k\in\Bbb N{\smallsetminus}\{0\}\\[1ex]0 &:&\text{otherwise}\end{cases}\\[2ex]&=\begin{cases}\mathsf P(U\,{=}\,1)+\mathsf P(U\,{=}\,0)~\mathsf P(Y\,{=}\,0)&:& k=0\\[1ex] \mathsf P(U\,{=}\,0)~\mathsf P(Y\,{=}\,k)&:&k\in\Bbb N\smallsetminus \{0\}\\[1ex]0 &:&\text{otherwise}\end{cases}\\[2ex]&~~\vdots\end{align}$$

Just fill in the ellipsis.
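To see the agreement numerically, one can simulate $Z=(1-U)Y$ and compare the empirical frequencies against the two cases above. This is only an illustrative sketch: the parameter values are arbitrary, and since the standard-library `random` module has no Poisson sampler, it hand-rolls one using Knuth's method.

```python
import math
import random
from collections import Counter

random.seed(1)
p, lam, n = 0.3, 2.0, 200_000

def sample_poisson(lam):
    """Knuth's method: count uniform draws until their product drops below e^{-lam}."""
    limit = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

counts = Counter()
for _ in range(n):
    U = 1 if random.random() < p else 0  # U = 1 means heads
    Y = sample_poisson(lam)
    counts[(1 - U) * Y] += 1

# Empirical P(Z = 0) should be close to p + (1 - p) * e^{-lam},
# and P(Z = k) close to (1 - p) * e^{-lam} * lam**k / k! for k >= 1.
freq0 = counts[0] / n
freq1 = counts[1] / n
```

With 200,000 draws the empirical frequencies land within about a percentage point of the formula, which is in line with the expected sampling error.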

> Again, not sure about part 2, so I'm also not sure about this problem.

Here you just need to know the mean and variance for Bernoulli and Poisson variables, which you were told.

$$\begin{align}\mathsf E(U)&=p\\\mathsf{Var}(U) &= p(1-p)\\\mathsf E(Y)&=\lambda\\\mathsf{Var}(Y)&=\lambda\end{align}$$

Everything else is just applying the definitions of expectation and variance and the hint you were given. Note: since $U$ and $Y$ are independent, so too are $(1-U)$ and $Y$, and also $(1-U)^2$ and $Y^2$, so the hint tells you:

$$\begin{align}\mathsf{E}((1-U)Y)&=\mathsf E(1-U)~\mathsf E(Y)\\\mathsf E((1-U)^2 Y^2)&=\mathsf E((1-U)^2)~\mathsf E(Y^2)\end{align}$$
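Once you have worked out $\mathsf E(X)$ and $\mathsf{Var}(X)$, a quick numerical check against the part 1 PMF might look like the sketch below. The closed forms in the comments are my own algebra from the hint (using $(1-U)^2=1-U$ for a Bernoulli variable), so verify them against your own derivation; `p` and `lam` are arbitrary test values.

```python
import math

p, lam = 0.3, 2.0

def pmf_X(k):
    # Zero-inflated Poisson PMF from part 1 (p and lam fixed above).
    pois_k = math.exp(-lam) * lam**k / math.factorial(k)
    return p + (1 - p) * pois_k if k == 0 else (1 - p) * pois_k

# Mean and variance computed straight from the PMF definition.
mean = sum(k * pmf_X(k) for k in range(100))
var = sum(k * k * pmf_X(k) for k in range(100)) - mean**2

# These should match the hint-derived closed forms (my algebra, do verify):
#   E(X)   = (1 - p) * lam
#   Var(X) = (1 - p) * lam + p * (1 - p) * lam**2
```

Agreement between the brute-force sums and the closed forms is good evidence the algebra is right.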