Why these test statistics for these distributions


For a random sample $X_1,\ldots,X_n$ from the Poisson distribution, I understand where the T-statistic comes from, because for a Poisson distribution $E(X) = \lambda$ and $V(X)=\lambda$, so it becomes:

$$T= \frac{\bar{X}-E(X)}{\sqrt{\frac{V(X)}{n}}} =\frac{\bar{X}-\lambda}{\sqrt{\frac{\lambda}{n}}},$$

where $\bar{X} = (X_1+\cdots+X_n)/n$.

But for the geometric distribution I have $E(X) = \frac{1}{p}$ and $V(X) = \frac{1}{p^2}$. And if I substitute:

$$T = \frac{\bar{X}-E(X)}{\sqrt{\frac{V(X)}{n}}} = \frac{\bar{X}-\frac{1}{p}}{\sqrt{\frac{1}{np^2}}} \neq \frac{\bar{X}-\frac{1}{p}}{\sqrt{\frac{1-p}{np^2}}}$$

The expression on the right is the correct one. Where does that $1-p$ under the square root in the denominator come from? And how would I proceed with the binomial and exponential distributions?


Best answer:

The variance of a geometric distribution is not $1/p^2$. Suppose $X \sim \operatorname{Geometric}(p)$ with PMF $$\Pr[X = x] = p(1-p)^{x-1}, \quad x \in \{1, 2, \ldots \}.$$ (Note that the choice of support is irrelevant because the variance is independent of location.) Then recall the formulas

$$g_1(z) = \sum_{k=1}^\infty kz^{k-1} = \frac{d}{dz}\left[\sum_{k=1}^\infty z^k\right] = \frac{d}{dz} \left[\frac{1}{1-z}\right] = \frac{1}{(1-z)^2}, \tag{1}$$ $$g_2(z) = \sum_{k=1}^\infty k(k-1)z^{k-1} = z \frac{dg_1}{dz} = \frac{2z}{(1-z)^3}. \tag{2}$$
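As a quick numerical sanity check (my own addition, not part of the answer), the closed forms in $(1)$ and $(2)$ can be verified by comparing truncated series against them at a test point, say $z = 0.3$:

```python
# Verify g1(z) = 1/(1-z)^2 and g2(z) = 2z/(1-z)^3 by truncated series.
z = 0.3  # any point with |z| < 1; truncation error is negligible at 400 terms

g1_series = sum(k * z**(k - 1) for k in range(1, 400))
g2_series = sum(k * (k - 1) * z**(k - 1) for k in range(1, 400))

print(abs(g1_series - 1 / (1 - z)**2))      # essentially zero
print(abs(g2_series - 2 * z / (1 - z)**3))  # essentially zero
```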

Therefore, $$\operatorname{E}[X] = \sum_{x=1}^\infty px(1-p)^{x-1} = p g_1(1-p) = \frac{p}{p^2} = \frac{1}{p}, \tag{3}$$ and $$\operatorname{E}[X(X-1)] = pg_2(1-p) = \frac{2(1-p)}{p^2}. \tag{4}$$

It follows that

$$\begin{align}\operatorname{Var}[X] &= \operatorname{E}[X^2] - \operatorname{E}[X]^2 \\ &= \operatorname{E}[X(X-1) + X] - \operatorname{E}[X]^2 \\ &= \operatorname{E}[X(X-1)] + \operatorname{E}[X] - \operatorname{E}[X]^2 \\ &= \frac{2(1-p)}{p^2} + \frac{1}{p} - \frac{1}{p^2} \\ &= \frac{1-p}{p^2}. \tag{5} \end{align}$$
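A small simulation (a sketch using NumPy's geometric sampler, whose support is $\{1,2,\ldots\}$ as above) confirms $(5)$:

```python
import numpy as np

rng = np.random.default_rng(42)
p = 0.3
x = rng.geometric(p, size=1_000_000)  # one million draws from Geometric(p)

print(x.mean())  # close to E[X] = 1/p
print(x.var())   # close to Var[X] = (1-p)/p^2
```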

Second answer:

It is not clear what the purpose of your statistic is but, as noted, if $X\sim \text{Geometric}(p)$, then $\text{var}(X) = \frac{1-p}{p^2}$.

Furthermore, if $$X\sim\text{Binomial}(m,p)$$ then $E(X)=mp$ and $\text{var}(X)=mp(1-p)$. And if

$$X\sim\text{Exponential}(p),$$ then $E(X)=1/p$ and $\text{var}(X)=1/p^2$. So to obtain each statistic, simply substitute these moments into the general formula for $T$.
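To make the substitution concrete, here is a sketch in Python; the helper names (`t_statistic`, `poisson_T`, etc.) are my own for illustration, not from any library:

```python
import math

def t_statistic(xbar, n, mean, var):
    """Generic T = (xbar - E[X]) / sqrt(Var(X)/n)."""
    return (xbar - mean) / math.sqrt(var / n)

# Plug in the moments of each distribution:
def poisson_T(xbar, n, lam):
    return t_statistic(xbar, n, lam, lam)                  # E = V = lambda

def geometric_T(xbar, n, p):
    return t_statistic(xbar, n, 1 / p, (1 - p) / p**2)     # E = 1/p, V = (1-p)/p^2

def binomial_T(xbar, n, m, p):
    return t_statistic(xbar, n, m * p, m * p * (1 - p))    # E = mp, V = mp(1-p)

def exponential_T(xbar, n, rate):
    return t_statistic(xbar, n, 1 / rate, 1 / rate**2)     # E = 1/rate, V = 1/rate^2
```

For example, `geometric_T(4, 100, 0.25)` is exactly $0$, since $\bar{X}=4$ equals $1/p$ there.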

As a side note, your $T$ statistic coincides with the Wald statistic based on the maximum likelihood estimator and its asymptotic variance. The nice thing about the Wald statistic is that it is asymptotically standard normal, which you can use to do inference on your parameter of interest.
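To see that asymptotic normality in action, a simulation sketch (assuming the geometric case, with the true $p$ plugged in): generate many samples, form $T$ for each, and check that the empirical mean and standard deviation are near $0$ and $1$:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 500, 20_000

X = rng.geometric(p, size=(reps, n))  # reps independent samples of size n
xbar = X.mean(axis=1)
T = (xbar - 1 / p) / np.sqrt((1 - p) / (p**2 * n))

print(T.mean())  # near 0
print(T.std())   # near 1
```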