Trying to find consistency of an estimator: Stat Theory


So I have a homework problem that my study group and I are stuck on.

The problem goes, "Let $X$ be a random variable with mean $\mu$ and variance $\sigma^2$, and suppose we have a sample $\{X_1, X_2 , \ldots , X_n \}$.

Show that $T= \frac{2}{n(n+1)}\sum_i iX_i$ is a consistent estimator for $\mu$."

Our professor gave us three hints; the first, which we are struggling with, is to find the mean and variance of $T$.

We are assuming that $T$ is a discrete random variable since it is a summation.

To find $$E[X] = \sum xp(x)$$ $$= xT$$ $$= \sum x\left[\frac{2}{n(n+1)}\sum_iiX_i\right]$$ $$= \sum x\left[\frac{2}{n(n+1)}\cdot\frac{n(n+1)}{2}\sum_iX_i\right]$$ Since the sum of the $i$'s is $\frac{n(n+1)}{2}$ $$= \sum x\left[\sum_iX_i\right]$$ $$= \sum x(\overline{x}n)$$ If we manipulate the summation $\frac{X_i}{n}=\overline{x}$ formula around to make $n\overline{x}=\sum_iX_i$ $$=\overline{x}\sum xn$$

This is pretty much where we get stuck, and we begin to question whether or not we're going down the right bunny trail with this one.

After speaking with some of my other classmates, there is another thought on how to set up this problem:

$$= \sum iT$$ $$= \sum i\left[\frac{2}{n(n+1)}\sum_iiX_i\right]$$ $$= \sum i\left[\frac{2}{n(n+1)}\cdot\frac{n(n+1)}{2}\cdot\sum_iX_i\right]$$ Since the sum of the $i$'s is $\frac{n(n+1)}{2}$ $$= \sum[iX_i]$$

This pretty much leaves us with $\frac{n(n+1)}{2}\sum_iX_i$, and I could throw in the mean there again, but I doubt that is going to work for this problem.

Could someone check our logic to see if we are thinking about this problem correctly or if we need to change our approach on how we should calculate this mean?

Thanks!!

There is 1 best solution below.


Consider the random variable $W=\sum iX_i$. By the linearity of expectation, $$E(W)=E\left(\sum_1^n i X_i\right)=\sum_1^n i E(X_i)=\left(\sum_1^n i\right)\mu.$$ But the sum $1+2+3+\cdots +n$ is $\frac{n(n+1)}{2}$. Since $T=\frac{2}{n(n+1)}W$, it follows that $$E(T)=\frac{2}{n(n+1)}\cdot\frac{n(n+1)}{2}\,\mu=\mu,$$ so $T$ is unbiased.

For the variance, recall that if $U$ and $V$ are independent random variables, then $\text{Var}(aU+bV)=a^2\text{Var}(U)+b^2\text{Var}(V)$.
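Carrying that hint through (a sketch, assuming the $X_i$ are independent with common variance $\sigma^2$, and using $\sum_1^n i^2 = \frac{n(n+1)(2n+1)}{6}$):

$$\text{Var}(T)=\left(\frac{2}{n(n+1)}\right)^2\sum_{i=1}^n i^2\,\text{Var}(X_i)=\frac{4\sigma^2}{n^2(n+1)^2}\cdot\frac{n(n+1)(2n+1)}{6}=\frac{2(2n+1)\sigma^2}{3n(n+1)}\to 0$$

as $n\to\infty$. An unbiased estimator whose variance tends to $0$ converges in probability to the parameter (by Chebyshev's inequality), which gives consistency.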

There is no reason to think of $T$ as discrete. If the distribution of the $X_i$ is continuous, then so is the distribution of $T$.
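As a sanity check (not part of the proof), here is a quick simulation sketch with hypothetical values $\mu=5$, $\sigma=2$, illustrating that $T$ concentrates around $\mu$ as $n$ grows:

```python
import random

def T(xs):
    """Weighted estimator T = 2/(n(n+1)) * sum_{i=1}^{n} i * X_i."""
    n = len(xs)
    return 2.0 / (n * (n + 1)) * sum(i * x for i, x in enumerate(xs, start=1))

random.seed(0)
mu, sigma = 5.0, 2.0  # hypothetical parameters for illustration

for n in (100, 10_000):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    # T(xs) should land closer to mu as n increases, since Var(T) -> 0
    print(n, T(xs))
```

With the variance formula $\text{Var}(T)=\frac{2(2n+1)\sigma^2}{3n(n+1)}\approx\frac{4\sigma^2}{3n}$, the spread at $n=10{,}000$ is already tiny, so the printed estimates hug $\mu=5$.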