Let $(X_n)_{n \geq 1}$ be a sequence of i.i.d. random variables taking values in $\mathbb{R}^d$, $d \geq 1$, such that $E[\|X_1\|^2]<+\infty$, where $\|\cdot\|$ is a norm on $\mathbb{R}^d$, and let $\overline{X}_n=\frac{1}{n}\sum_{k=1}^n X_k$. We already know that $\sqrt{n}(\overline{X}_n-E[X_1])$ converges in distribution to $N(0,K)$, where $K$ is the covariance matrix of $X_1$. Now let $f:\mathbb{R}^d \rightarrow \mathbb{R}^p$, $p \geq 1$, be a function.
I am looking for conditions on $f$ under which $\sqrt{n}(f(\overline{X}_n)-f(E[X_1]))$ converges in distribution. Should $f$ be differentiable at $E[X_1]$? Is the existence of the first partial derivatives of $f$ at $E[X_1]$ sufficient?
The delta method to the rescue: https://fr.wikipedia.org/wiki/M%C3%A9thode_delta#Cas_multivari%C3%A9.
Let $\theta \in \mathbb{R}^d$ and suppose $(Y_n)_{n \geq 1}$ is a sequence of random vectors in $\mathbb{R}^d$. Suppose further that
$$\sqrt{n}(Y_n-\theta) \overset{L}{\longrightarrow} N_d(0,K)$$
and that $f$ is differentiable at $\theta$; write $\nabla f(\theta)$ for the $d \times p$ matrix of first partial derivatives of $f$ at $\theta$.
Then $\sqrt{n}(f(Y_n)-f(\theta)) \overset{L}{\longrightarrow} N_p(0,\nabla f(\theta)^T K \nabla f(\theta))$.
Now, use the above result with $\theta := E[X_1]$ and $Y_n := \overline{X}_n$. Note that the delta method requires $f$ to be (totally) differentiable at $E[X_1]$: the mere existence of the first partial derivatives there is not enough in general, since the proof rests on a first-order Taylor expansion of $f$ at $E[X_1]$.
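As a sanity check, here is a small Monte Carlo sketch of the multivariate delta method. The setup (dimension $d=2$, $f(x)=x_1 x_2$, a particular $\mu$ and $K$) is my own illustrative choice, not from the question; it compares the empirical variance of $\sqrt{n}(f(\overline{X}_n)-f(\mu))$ to the predicted $\nabla f(\mu)^T K \nabla f(\mu)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the question):
# X ~ N(mu, K) in R^2 and f(x) = x_1 * x_2, so p = 1.
mu = np.array([1.0, 2.0])
K = np.array([[1.0, 0.3],
              [0.3, 2.0]])

def f(x):
    # f acts on the last axis, so it works on batches of points.
    return x[..., 0] * x[..., 1]

# Gradient of f at mu: (mu_2, mu_1), a d x p = 2 x 1 matrix here.
grad = np.array([mu[1], mu[0]])

n, reps = 2000, 5000
# Draw `reps` independent samples of size n and average each one.
X = rng.multivariate_normal(mu, K, size=(reps, n))
Xbar = X.mean(axis=1)                      # shape (reps, 2)

# Delta method: sqrt(n)(f(Xbar) - f(mu)) ~ N(0, grad^T K grad).
Z = np.sqrt(n) * (f(Xbar) - f(mu))
predicted_var = grad @ K @ grad            # scalar, since p = 1
print("empirical:", Z.var(), "predicted:", predicted_var)
```

The empirical variance should land close to the predicted value (here $2^2 \cdot 1 + 2 \cdot 2 \cdot 0.3 + 1^2 \cdot 2 = 7.2$), with the gap shrinking as $n$ and the number of replications grow.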