CLT for functions of random variables


Let $X$ be a random variable with mean $\mu$ and variance $\sigma^2$, and let $X_i$, $i=1,\dots,n$, be a sequence of i.i.d. observations of $X$. The central limit theorem says that $Z = \sqrt{n}\,\frac{\bar{X} - \mu}{\sigma}$ converges in distribution to $\mathcal{N}(0,1)$, and the Berry-Esseen theorem gives a quantitative bound on how fast the distribution of $Z$ approaches the normal. My question is the following: is there a theorem or bound that quantifies how fast a function of $X$, let's say $f(X)$, approaches its normal approximation?

Some extra information:

  • If it helps, $X$ may be assumed to be a Gaussian random variable with zero mean.
  • I am asking this question because I want to evaluate (or bound) the error of the Monte-Carlo estimation of $E[f(X)]$: I use Monte-Carlo simulation to evaluate the expectation, and I want to bound the estimation error as a function of the number of samples.
  • The delta method, from my understanding, is not useful here, since I assume the mean is zero and, in my case, $f'(0)$ is also zero. Moreover, the function is highly non-linear, so I am not sure the delta method would help even if $f'(0) \neq 0$.
  • Answering such a problem might be hard, so a good reference (or a pointer to which theorem/bound to look for) is more than enough.
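Since the Monte-Carlo estimator of $E[f(X)]$ is itself the sample mean of $Y_i = f(X_i)$, one standard approach is to apply the CLT directly to the $Y_i$ and report a $\pm 1.96\, s/\sqrt{n}$ error radius from their sample standard deviation $s$. A minimal sketch, assuming $X \sim \mathcal{N}(0,1)$ and using $f(x) = x^2$ (so $E[f(X)] = 1$) purely as an illustrative choice:

```python
import math
import random

random.seed(0)

def mc_estimate(f, n):
    """Monte Carlo estimate of E[f(X)] for X ~ N(0, 1), together with a
    CLT-based 95% error radius 1.96 * s / sqrt(n), where s is the sample
    standard deviation of the values f(X_i)."""
    samples = [f(random.gauss(0.0, 1.0)) for _ in range(n)]
    mean = sum(samples) / n
    var = sum((y - mean) ** 2 for y in samples) / (n - 1)
    return mean, 1.96 * math.sqrt(var / n)

# Example: f(x) = x**2, so E[f(X)] = Var(X) = 1 exactly.
est, err = mc_estimate(lambda x: x * x, 100_000)
```

Note that this error radius is itself only asymptotically valid; Berry-Esseen applied to the $Y_i$ (which requires $E|f(X)|^3 < \infty$) is what quantifies how trustworthy it is for finite $n$.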
Best answer

If $X$ is Gaussian, then $Z \sim \mathcal{N}(0,1)$ exactly for every $n$, so there is no approximation error at all.

If $f'(0) = 0$, the delta method still works: you use the second term of the Taylor expansion instead. The second-order delta method gives $n\,\bigl(f(\bar{X}) - f(0)\bigr) \Rightarrow \frac{\sigma^2 f''(0)}{2}\,\chi^2_1$. If $f''(0)$ exists, it does not matter that $f$ is highly nonlinear, since only the local behavior at $0$ enters the limit.
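To make the second-order statement concrete, here is a small simulation sketch (with $f(x) = \cos x$ chosen purely as an example of $f'(0) = 0$, $f''(0) = -1$): the scaled statistic $n\,(f(\bar{X}) - f(0))$ should behave like $-\tfrac{1}{2}\chi^2_1$, whose mean is $-1/2$.

```python
import math
import random
import statistics

random.seed(1)

# Second-order delta method: for X_i i.i.d. N(0, 1) and f with f'(0) = 0,
# n * (f(Xbar) - f(0)) converges in distribution to (f''(0) / 2) * chi^2_1.
# Here f(x) = cos(x), so the limit is -chi^2_1 / 2, with mean -1/2.
def scaled_stat(n):
    xbar = sum(random.gauss(0.0, 1.0) for _ in range(n)) / n
    return n * (math.cos(xbar) - 1.0)

reps = [scaled_stat(200) for _ in range(5_000)]
mean_reps = statistics.fmean(reps)  # close to -0.5 for large n and many reps
```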

You may also use Edgeworth expansions, which refine the normal approximation with correction terms built from higher moments (skewness, kurtosis).
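For reference, the first-order Edgeworth correction to the CDF of the standardized sample mean uses the skewness $\gamma$ of the summands: $P(Z \le x) \approx \Phi(x) - \phi(x)\,\gamma\,(x^2 - 1)/(6\sqrt{n})$. A sketch (the function name `edgeworth_cdf` is mine, not standard):

```python
import math

def edgeworth_cdf(x, n, skew):
    """First-order Edgeworth approximation to the CDF of the standardized
    sample mean Z = sqrt(n) * (Xbar - mu) / sigma:
        P(Z <= x) ~ Phi(x) - phi(x) * skew * (x**2 - 1) / (6 * sqrt(n)),
    where skew is the skewness of the summands.  With skew = 0 the
    correction vanishes and the plain normal approximation is recovered.
    """
    phi = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)   # standard normal pdf
    Phi = 0.5 * (1 + math.erf(x / math.sqrt(2)))          # standard normal cdf
    return Phi - phi * skew * (x * x - 1) / (6 * math.sqrt(n))
```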