Can it ever be that for a random sample $X_1, \dots, X_n$ we have $\frac{1}{n}\sum_{i=1}^n X_i^2 < \left(\frac{1}{n}\sum_{i=1}^n X_i\right)^2$?


I had a homework problem about using the method of moments to estimate the parameter of a uniform random variable.

I probably made a calculation mistake, because, as was pointed out to me, we showed in an earlier exercise via the Cauchy–Schwarz inequality that $\frac{1}{n}\sum_{i=1}^n X_i^2 \geq \left(\frac{1}{n}\sum_{i=1}^n X_i\right)^2$.
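For reference, here is one way that inequality follows, assuming the exercise applied Cauchy–Schwarz to the data vector and the all-ones vector:

```latex
% Cauchy–Schwarz with u = (X_1, \dots, X_n) and v = (1, \dots, 1):
\left(\sum_{i=1}^n X_i \cdot 1\right)^2
  \le \left(\sum_{i=1}^n X_i^2\right)\left(\sum_{i=1}^n 1^2\right)
  = n \sum_{i=1}^n X_i^2
% Dividing both sides by n^2 gives
\left(\frac{1}{n}\sum_{i=1}^n X_i\right)^2 \le \frac{1}{n}\sum_{i=1}^n X_i^2 .
```

Equivalently, the difference $\frac{1}{n}\sum X_i^2 - \left(\frac{1}{n}\sum X_i\right)^2 = \frac{1}{n}\sum (X_i - \bar{X})^2 \ge 0$, so it is the (biased) sample variance and can never be negative.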

But I thought I could come up with examples, and I distinctly remember having read somewhere that a pitfall of using the method of moments to estimate variance is that the opposite can hold, i.e.

$\frac{1}{n}\sum_{i=1}^n X_i^2 < \left(\frac{1}{n}\sum_{i=1}^n X_i\right)^2$, which would result in a negative variance estimate, which cannot exist. So I am a bit confused: can this case ever occur, or am I mixing something up?
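As a quick sanity check (a sketch using NumPy; the sample size and distribution are arbitrary choices, not from the exercise), one can draw many random samples and count how often the second sample moment falls below the squared first sample moment:

```python
import numpy as np

rng = np.random.default_rng(0)

# Count how often (1/n) * sum(x_i^2) < ((1/n) * sum(x_i))^2
# across many random samples. Since the difference equals the
# (biased) sample variance, it should never happen.
violations = 0
for _ in range(10_000):
    x = rng.uniform(-5, 5, size=20)  # arbitrary sample; any data works
    if np.mean(x**2) < np.mean(x) ** 2:
        violations += 1

print(violations)
```

This prints `0`: no sample violates the inequality, consistent with the Cauchy–Schwarz argument.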

Thank you for your time and help!