I got the following statement from a lecture; it says that the estimators of skewness and kurtosis are
$\hat{S}=\frac{1}{(n-1)\hat{\sigma}^3}\sum_{i=1}^n(x_i-\overline{x})^3$
$\hat{K}=\frac{1}{(n-1)\hat{\sigma}^4}\sum_{i=1}^n(x_i-\overline{x})^4$
where $\hat{\sigma}$ is the sample estimator of the standard deviation.
The lecture states that
$\hat{S}\sim N(0,\frac{6}{n})$
$\hat{K}-3\sim N(0,\frac{24}{n})$
assuming that the sample is iid normally distributed.
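As a quick sanity check of these claimed variances (my own Monte Carlo sketch, not from the lecture; the sample size `n` and replication count `reps` are my choices), one can simulate many normal samples, apply the lecture's formulas, and compare the empirical variances of $\hat{S}$ and $\hat{K}-3$ against $6/n$ and $24/n$:

```python
# Monte Carlo check of the claimed variances 6/n and 24/n under normality.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 500, 10_000

x = rng.standard_normal((reps, n))                 # reps iid N(0,1) samples of size n
xbar = x.mean(axis=1, keepdims=True)
sigma_hat = x.std(axis=1, ddof=1, keepdims=True)   # sample SD, as in the lecture

# The lecture's estimators, with the (n-1) normalisation
S_hat = ((x - xbar) ** 3).sum(axis=1) / ((n - 1) * sigma_hat[:, 0] ** 3)
K_hat = ((x - xbar) ** 4).sum(axis=1) / ((n - 1) * sigma_hat[:, 0] ** 4)

print(S_hat.var(), 6 / n)           # empirical variance vs 6/n
print((K_hat - 3).var(), 24 / n)    # empirical variance vs 24/n
```

The empirical variances come out close to $6/n$ and $24/n$ (the exact finite-sample variances differ from these by terms of order $1/n$, so the match improves as $n$ grows).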
Is there a proof of this?
Thanks a lot!
The following is some of the effort I've put in using the CLT:
I assume that $E(x)=\mu,E((x-\mu)^2)=\sigma^2$ are known.
So for each random sample $x_i$ the term $\frac{x_i-\mu}{\sigma}\sim N(0,1)$ is standard normal.
Therefore $E\left(\left(\frac{x_i-\mu}{\sigma}\right)^3\right)=0$
and $Var\left(\left(\frac{x_i-\mu}{\sigma}\right)^3\right)=E\left(\left(\frac{x_i-\mu}{\sigma}\right)^6\right)-\left(E\left(\left(\frac{x_i-\mu}{\sigma}\right)^3\right)\right)^2=15-0=15$
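These moment values can be checked numerically: for standard normal $Z$, $E(Z^{2k})=(2k-1)!!$, so $E(Z^6)=5\cdot 3\cdot 1=15$, while all odd moments vanish by symmetry. A small simulation (my own sketch):

```python
# Numerical check of the standard normal moments used above:
# E(Z^3) = 0 by symmetry, E(Z^6) = 5!! = 15, hence Var(Z^3) = 15 - 0^2 = 15.
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(2_000_000)

print((z**3).mean())   # close to 0
print((z**6).mean())   # close to 15
print((z**3).var())    # close to 15
```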
So when the CLT is invoked, I get the distribution $\left(\frac{x_i-\mu}{\sigma}\right)^3\sim N(0,15)$. Is that right?
It's not true: $\left(\frac{x_i-\mu}{\sigma}\right)^3$ is a single random variable with a fixed, non-normal distribution, and the CLT says nothing about individual observations. What the CLT does give is that the scaled sample average is asymptotically normal, $\sqrt{n}\cdot\frac{1}{n}\sum_{i=1}^n\left(\frac{x_i-\mu}{\sigma}\right)^3\xrightarrow{d}N(0,15)$ as $n\to\infty$. The variance then drops from $15$ to $6$ once you account for estimating $\mu$: since $Cov(Z^3,Z)=3$, the relevant term becomes $Z^3-3Z$, and $Var(Z^3-3Z)=15+9-18=6$, which is where the $\frac{6}{n}$ comes from.
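To make the distinction concrete (again my own sketch, with `n` and `reps` chosen arbitrarily): a single cubed observation is not normal, but the scaled sample mean $\sqrt{n}\,\overline{Z^3}$ behaves approximately like $N(0,15)$ for large $n$:

```python
# The CLT applies to the scaled average of the Z_i^3, not to a single Z_i^3.
import numpy as np

rng = np.random.default_rng(2)
n, reps = 500, 20_000
z = rng.standard_normal((reps, n))

t = np.sqrt(n) * (z**3).mean(axis=1)     # sqrt(n) * sample mean of cubes

print(t.var())                           # close to 15 = Var(Z^3)
print(np.mean(np.abs(t) < np.sqrt(15)))  # close to 0.68, the normal 1-SD mass
```

The second printout checks a normal-shape property: for $N(0,15)$, about $68\%$ of the mass lies within one standard deviation $\sqrt{15}$ of zero.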