Joint sufficient statistics for normal distribution (denominator n - 1)

Example 24-6 here, for i.i.d. $X_i$ from a normal distribution with mean $\theta_1$ and variance $\theta_2$, expresses the joint density as

$$ f(\textbf{x}; \theta_1, \theta_2) = \exp\bigg[\frac{-1}{2\theta_2}\sum_{i = 1}^n x_i^2 + \frac{\theta_1}{\theta_2}\sum_{i = 1}^n x_i - \frac{n\theta_1^2}{2\theta_2} - n\log\sqrt{2\pi\theta_2}\bigg]. $$

A couple of sentences later, the text states that $S_2 = \frac{Y_1 - Y_2^2 / n}{n - 1}$, where $Y_1 = \sum_{i = 1}^n X_i^2$ and $Y_2 = \sum_{i = 1}^n X_i$.

Where did $S_2$ come from?

Where did dividing by $n - 1$ come from?

Best answer:

You have shown that $Y_1$ and $Y_2$ are jointly sufficient.

They want to claim that the familiar statistics $S_1 := \frac{1}{n} \sum_{i=1}^n X_i = \bar{X}$ (sample mean) and $S_2 := \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar{X})^2$ (sample variance) are also jointly sufficient.

To see this, they show that there is a one-to-one correspondence between $(Y_1, Y_2)$ and $(S_1, S_2)$: first, $S_1 = Y_2/n$; second, expanding $\sum_{i=1}^n (X_i - \bar{X})^2 = \sum_{i=1}^n X_i^2 - n\bar{X}^2 = Y_1 - Y_2^2/n$ gives $S_2 = \frac{Y_1 - (Y_2^2/n)}{n-1}$. The division by $n-1$ is nothing new: it is simply the denominator in the definition of the sample variance $S_2$. Since sufficiency is preserved under one-to-one transformations of a sufficient statistic, the factorization theorem applies to $(S_1, S_2)$ as well.
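The correspondence is an algebraic identity, so it can be checked numerically on any sample. Here is a minimal sketch (the data values are arbitrary) comparing $(S_1, S_2)$ recovered from $(Y_1, Y_2)$ against the usual sample mean and $(n-1)$-denominator sample variance from Python's standard library:

```python
import statistics

# Arbitrary sample; the identity holds for any data with n >= 2.
x = [2.0, 3.5, 1.2, 4.8, 2.9]
n = len(x)

# Jointly sufficient statistics from the factorization
y1 = sum(xi**2 for xi in x)  # Y1 = sum of squares
y2 = sum(x)                  # Y2 = sum

# Recover (S1, S2) from (Y1, Y2)
s1 = y2 / n
s2 = (y1 - y2**2 / n) / (n - 1)

# statistics.mean and statistics.variance use exactly these definitions
# (variance divides by n - 1), so the results agree to rounding error.
print(abs(s1 - statistics.mean(x)) < 1e-12)      # True
print(abs(s2 - statistics.variance(x)) < 1e-12)  # True
```

Note that `statistics.variance` (as opposed to `statistics.pvariance`) already divides by $n-1$, matching the $S_2$ in the text.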