Probability: mean and variance of a random sample


Suppose that $X_1, X_2, ..., X_n$ is a random sample from a distribution with mean $\mu$ and variance $\sigma^2$. Suppose also that $\nu:=E[(X_1 - \mu)^4] < \infty $.

(a) Find the mean and variance of $ V_n= \frac 1n \sum_{i=1}^n (X_i -\mu)^2 $.

(b) Now assume that the mean $\mu$ is known. Using the weak law of large numbers, explain why $V_n$ is a good estimator of the variance $\sigma^2$.

For part (a), can I let $V_i = (X_i -\mu)^2$ and then use the theorem for the expectation and variance of a sample mean to say that the expectation of $V_n$ is $\mu$ and the variance is $\sigma^2$? This seems like it cannot be right, but I am unsure.

For part (b) will I have to use Chebyshev's inequality at some point? Is that where the fact that $\nu:=E[(X_1 - \mu)^4] < \infty $ comes in?

Any help is appreciated!


BEST ANSWER

For (a), the idea of looking at the random variables $(X_i-\mu)^2$ is a good one. The choice $V_i$ for the name is potentially confusing.

Let's write $Y_i=(X_i-\mu)^2$. Then $V_n=\frac{1}{n}\left(Y_1+ Y_2+\cdots+Y_n\right)$.

The $Y_i$ are independent, so we will be able to find the mean and variance of $V_n$ once we know the mean and variance of the $Y_i$.
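To make the standard fact being invoked explicit (for i.i.d. $Y_i$ with finite variance):
$$E[V_n]=E\!\left[\frac{1}{n}\sum_{i=1}^n Y_i\right]=E[Y_1],\qquad \operatorname{Var}(V_n)=\frac{1}{n^2}\sum_{i=1}^n\operatorname{Var}(Y_i)=\frac{\operatorname{Var}(Y_1)}{n}.$$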

The mean of $Y_i$ should be easy to recognize, from the definition of the variance of $X_i$.

For the variance of the $Y_i$, we need to compute $E\left[(X_i-\mu)^4\right]-\left(E\left[(X_i-\mu)^2\right]\right)^2$. This is where $\nu$ comes into the game.
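Spelled out, with the same notation as above, that computation is just the usual variance formula applied to $Y_i$:
$$\operatorname{Var}(Y_i)=E\!\left[Y_i^2\right]-\left(E[Y_i]\right)^2=E\!\left[(X_i-\mu)^4\right]-\left(E\!\left[(X_i-\mu)^2\right]\right)^2=\nu-\sigma^4.$$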

Now that we have found the mean and variance of the $Y_i$, calculate the mean and variance of $V_n$. The mean turns out to be $\sigma^2$. The variance will be something with an $n$ in the denominator, so for large $n$ the variance of $V_n$ is small. Then you can use Chebyshev's inequality to draw conclusions. But depending on what theorems you have available to quote, you may not need to use Chebyshev's inequality explicitly.
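As a sanity check on the conclusion above, here is a small Monte Carlo sketch (an illustration, not part of the original answer; the specific distribution, sample size, and trial count are all assumptions chosen for the demo). It draws many samples from a normal distribution with known $\mu$ and $\sigma$, computes $V_n$ for each, and compares the empirical mean and variance of $V_n$ against the values $E[V_n]=\sigma^2$ and $\operatorname{Var}(V_n)=(\nu-\sigma^4)/n$.

```python
import random
import statistics

# Monte Carlo sketch: repeatedly compute V_n = (1/n) * sum (X_i - mu)^2
# and compare its empirical mean and variance with the theoretical values
# E[V_n] = sigma^2 and Var(V_n) = (nu - sigma^4) / n.
# For a normal distribution, the fourth central moment is nu = 3 * sigma^4
# (a property of the normal, assumed here for the comparison).

random.seed(0)

mu, sigma = 2.0, 1.5
n, trials = 50, 20000
nu = 3 * sigma ** 4  # fourth central moment of N(mu, sigma^2)

v_values = []
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    v_n = sum((x - mu) ** 2 for x in xs) / n
    v_values.append(v_n)

emp_mean = statistics.fmean(v_values)     # should be close to sigma^2 = 2.25
emp_var = statistics.pvariance(v_values)  # close to (nu - sigma^4)/n = 0.2025

print(emp_mean, emp_var)

# Since Var(V_n) shrinks like 1/n, Chebyshev's inequality gives
#   P(|V_n - sigma^2| >= eps) <= (nu - sigma^4) / (n * eps^2) -> 0,
# which is exactly the sense in which V_n is a good (consistent) estimator.
```

Increasing `n` shrinks the empirical variance of $V_n$ proportionally, which is the $1/n$ behavior the Chebyshev argument relies on.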