I'm trying to answer the following question:
let $(X_i)_{i=1}^n$ be a sequence of random variables with mean $\mu$ and variance $\sigma^2$. Find the asymptotic distribution of $$H_n = \left(n^{-1}\sum_{i=1}^n X_i,\ n^{-1}\sum_{i=1}^n X_i^2\right)^T,$$ after suitable normalization.
I tried to subtract a vector of expected values, $(\mu, \sigma^2-\mu^2)^T$, and premultiply by $\sqrt{n}\, I_{2\times 2}$ to get a vector $J_n$. Then, by the Central Limit Theorem, the marginal distribution of each component is normal. I then tried to use the Cramér–Wold device to deduce the asymptotic distribution of $J_n$, but struggled, since the components are not independent.
I would be grateful for any clues! Thanks!
As you suggested, let's first subtract the respective means and rescale, so we end up with $$ J_n=\left(\frac{1}{\sqrt{n}}\left(\sum_{i=1}^n X_i - n\mu\right),\ \frac{1}{\sqrt{n}}\left(\sum_{i=1}^n X_i^2 - n\mathbb{E}[X^2]\right)\right)^T.$$ (Throughout, assume the $X_i$ are i.i.d. with $\mathbb{E}[X^4]<\infty$, so that $X_i^2$ also has finite variance.)

The useful trick here is exactly the Cramér–Wold device you mention: to identify the limiting distribution of a random vector $Y_n$, it suffices to find the limit of $a\cdot Y_n$ for every fixed vector $a$; independence of the components is not required. So if $a=(a_1,a_2)$, we have $$ a\cdot J_n = \frac{1}{\sqrt{n}} \left(\sum_{i=1}^n \left( a_1X_i +a_2X_i^2\right) - n\,\mathbb{E}[a_1X+a_2X^2] \right).$$ By the (one-dimensional) CLT this converges to a Gaussian distribution with mean $0$ and variance $\operatorname{Var}(a_1X + a_2X^2)$, so we just need to compute that last quantity. But $$\operatorname{Var}(a_1X + a_2X^2) = \operatorname{Var}(a\cdot (X,X^2)) = a^T \operatorname{Cov}(Z)\, a,$$ where $Z$ is the random vector $(X,X^2)$, so explicitly $$\operatorname{Cov}(Z)=\begin{pmatrix}\operatorname{Var}(X) & \operatorname{Cov}(X,X^2)\\ \operatorname{Cov}(X,X^2) & \operatorname{Var}(X^2)\end{pmatrix}=\begin{pmatrix}\sigma^2 & \mathbb{E}[X^3]-\mu\,\mathbb{E}[X^2]\\ \mathbb{E}[X^3]-\mu\,\mathbb{E}[X^2] & \mathbb{E}[X^4]-\mathbb{E}[X^2]^2\end{pmatrix}.$$ Therefore $J_n$ converges in distribution to a Gaussian vector with mean $0$ and covariance matrix $\operatorname{Cov}(Z)$.
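If you want to sanity-check this numerically, here is a small Monte Carlo sketch (my own illustration, not part of the proof). It uses an Exponential(1) distribution as an assumed example, for which $\mathbb{E}[X^k]=k!$, and compares the empirical covariance of $J_n$ over many replications with $\operatorname{Cov}(Z)$:

```python
import numpy as np

# Monte Carlo check: for i.i.d. X_i, the rescaled vector J_n should have
# covariance Cov(Z) with Z = (X, X^2). Example distribution: Exponential(1),
# whose moments are E[X^k] = k!.
rng = np.random.default_rng(0)
n, trials = 1000, 5000

# Theoretical Cov(Z) from the moments m_k = E[X^k] = k! of Exp(1)
m1, m2, m3, m4 = 1.0, 2.0, 6.0, 24.0
cov_theory = np.array([
    [m2 - m1 ** 2, m3 - m1 * m2],  # [ Var(X)       Cov(X, X^2) ]
    [m3 - m1 * m2, m4 - m2 ** 2],  # [ Cov(X, X^2)  Var(X^2)    ]
])  # = [[1, 4], [4, 20]]

# Empirical covariance of J_n over many independent replications
X = rng.exponential(1.0, size=(trials, n))
J = np.sqrt(n) * np.column_stack([X.mean(axis=1) - m1,
                                  (X ** 2).mean(axis=1) - m2])
cov_empirical = np.cov(J, rowvar=False)

print(cov_theory)
print(cov_empirical)  # close to cov_theory
```

Note that $\operatorname{Var}(J_n)$ equals $\operatorname{Cov}(Z)$ exactly for every $n$ (only the shape of the distribution is asymptotic), so the agreement here is limited only by the number of replications.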