Joint probability mass function of iid random variables


Let $X_1 \dots X_n$ be iid random variables taking values $0, 1$ and $2$. Let $Y_i, i \in \{0,1,2\}$ be the number of $X_j$'s satisfying $X_j =i$. I am trying to find the joint distribution for $(Y_0, Y_1, Y_2)$.

One can observe that: \begin{align*} P(Y_0 = k, Y_1 = l, Y_2= m) &= P(Y_0 = k, Y_1 = l, Y_2= n - l -k)\\ &= P(X = 0)^kP(X=1)^lP(X=2)^{n-k-l} \end{align*}

I know what $P(X=i)$ is so I can take it from there. I am just wondering if I have skipped some important detail in the process above. Any comments will be greatly appreciated!

UPDATE:

Using Graham's suggestion, I was able to find the MLE of $0 \le \theta \le 1$, a parameter of which the aforementioned probabilities are a function. The MLE came out to be:

$$\displaystyle \hat{\theta}_{mle} = \frac{2n-2k-l}{2n-k}$$
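As a sanity check, one can maximize the log-likelihood numerically and compare against the closed form. The sketch below assumes the log-likelihood has the form $l(\theta) = (k+l)\log(1-\theta) + (2n-2k-l)\log\theta$, which is consistent with the second-derivative expression written further down; the counts $n, k, l$ are made up purely for illustration.

```python
import math

def log_lik(theta, n, k, l):
    # Assumed form: l(theta) = (k + l) log(1 - theta) + (2n - 2k - l) log(theta)
    return (k + l) * math.log(1 - theta) + (2 * n - 2 * k - l) * math.log(theta)

def mle_closed_form(n, k, l):
    # Closed-form MLE from the update above
    return (2 * n - 2 * k - l) / (2 * n - k)

# Hypothetical counts: n observations, k zeros, l ones
n, k, l = 50, 12, 20

# Dense grid search over (0, 1); the log-likelihood is concave in theta,
# so the grid argmax lands within one grid step of the true maximizer
grid = [i / 100000 for i in range(1, 100000)]
theta_grid = max(grid, key=lambda t: log_lik(t, n, k, l))

print(theta_grid, mle_closed_form(n, k, l))
```

The two values agree to within the grid resolution, which supports the closed form above.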

I am now trying to compute the asymptotic variance of this MLE using the Fisher information, namely the negative expectation of the second derivative of the log-likelihood:

\begin{align*} \displaystyle -\mathbb{E} \left ( l''(\theta; \bf{x} )\right ) &= -\mathbb{E} \left ( \frac{-k - l}{(1-\theta)^2} - \frac{2n-2k-l}{\theta^2} \right ) \end{align*}

Only $k$ and $l$ are random and of interest here, since everything else is constant for the expectation. Recall that $k$ is the number of $X_j$'s equal to $0$ among the $n$ iid random variables (see above). So the question now becomes: is there a quick way of computing $\mathbb{E}(Y_0)$?
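Since $Y_0 = \sum_{j=1}^n \mathbf{1}\{X_j = 0\}$ is a sum of $n$ iid indicators, linearity of expectation gives $\mathbb{E}(Y_0) = n\,P(X=0)$ (marginally, each $Y_i$ is $\text{Binomial}(n, P(X=i))$). A quick Monte Carlo check, using a made-up distribution for $X$ purely for illustration:

```python
import random

random.seed(0)

# Hypothetical distribution for X over {0, 1, 2}; for illustration only
p = [0.5, 0.3, 0.2]
n = 30          # sample size per replication
reps = 20000    # number of Monte Carlo replications

total_y0 = 0
for _ in range(reps):
    # Draw n iid copies of X and count how many equal 0
    sample = random.choices([0, 1, 2], weights=p, k=n)
    total_y0 += sample.count(0)

mc_mean = total_y0 / reps
print(mc_mean, n * p[0])   # Monte Carlo estimate of E(Y_0) vs n * P(X = 0)
```

The same linearity argument handles $\mathbb{E}(Y_1)$ in the Fisher-information expression.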

BEST ANSWER

Recall how a binomial distribution's pmf is established.

You have a multinomial distribution.

$${\mathsf P(Y_0{=}k, Y_1{=}l, Y_2{=}n{-}k{-}l)} ~=~ {\binom n{k,l,n-k-l}\,\mathsf P(X_i{=}0)^k\,\mathsf P(X_i{=}1)^l\,\mathsf P(X_i{=}2)^{n-k-l}}$$
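As a quick check that the multinomial coefficient makes these probabilities sum to one, one can enumerate the full support for a small $n$; the probability vector below is arbitrary and chosen only for illustration.

```python
from math import factorial

def multinomial_pmf(n, k, l, p0, p1, p2):
    # P(Y0 = k, Y1 = l, Y2 = n - k - l) with coefficient n! / (k! l! (n-k-l)!)
    m = n - k - l
    coef = factorial(n) // (factorial(k) * factorial(l) * factorial(m))
    return coef * p0**k * p1**l * p2**m

p0, p1, p2 = 0.2, 0.5, 0.3   # arbitrary probabilities for X = 0, 1, 2
n = 6

# Sum the pmf over the whole support {(k, l) : k, l >= 0, k + l <= n}
total = sum(multinomial_pmf(n, k, l, p0, p1, p2)
            for k in range(n + 1) for l in range(n + 1 - k))
print(total)
```

Without the coefficient the sum falls well short of one, which is the detail the question's derivation skipped.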


PS: that's the multinomial coefficient: $\binom n{k,l,n-k-l}= \frac{n!}{k!~l!~(n-k-l)!}$


PPS: We should also note the support: $k, l \in \{0, 1, \dots, n\}$ with $k+l\leq n$