I am sorry if this question is vague since I am completely unfamiliar with probability theory.
Suppose that we have a family of real-valued random variables $X_n$ (say, all of them have mean 0) on some probability space and we would like to show that $X_n$ converges weakly to a Gaussian distribution.
What are the standard/general techniques of showing such convergence to Gaussian distribution?
I am aware of the following:
- Moments. Check that $E[X_n^k]$ converges to the $k$-th moment of the Gaussian for all $k \in \mathbb{N}$; this suffices because the Gaussian distribution is determined by its moments.
- Work with characteristic functions instead. This seems to be the method used to prove, for example, the classical central limit theorem.
- Stein's method also seems to be common in practice these days, although I am not sure if this is a real workhorse in probability theory.
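To make the second bullet concrete, here is the standard sketch in the i.i.d. case with mean $0$ and variance $1$ (glossing over the Taylor-expansion details):

$$\varphi_{S_n/\sqrt{n}}(t) \;=\; \left[\varphi_{X_1}\!\left(\frac{t}{\sqrt{n}}\right)\right]^n \;=\; \left[1 - \frac{t^2}{2n} + o\!\left(\frac{1}{n}\right)\right]^n \;\longrightarrow\; e^{-t^2/2},$$

and since $e^{-t^2/2}$ is the characteristic function of $N(0,1)$, Lévy's continuity theorem upgrades this pointwise convergence to weak convergence.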
and I just want to know if this is how people usually approach this in probability theory, or there are some other general approaches as well.
Along the same lines, I also want to know about the techniques for showing convergence of complex random variables to the complex Gaussian. This can probably be reduced to the real case by separating the real and imaginary parts and checking their joint covariance structure, but I would be curious to know if there is some uniform way of viewing this as well.
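To spell out the reduction I have in mind: the standard complex Gaussian $Z$ is characterized by $E[Z]=0$, $E[|Z|^2]=1$, and $E[Z^2]=0$, i.e. $(\operatorname{Re} Z, \operatorname{Im} Z)$ is bivariate normal with covariance $\frac{1}{2}I_2$. So $Z_n \to Z$ weakly iff the real pair $(\operatorname{Re} Z_n, \operatorname{Im} Z_n)$ converges weakly to that bivariate normal, which by Cramér–Wold amounts to showing

$$a\,\operatorname{Re} Z_n + b\,\operatorname{Im} Z_n \;\longrightarrow\; N\!\left(0,\tfrac{a^2+b^2}{2}\right) \quad \text{for all } a,b \in \mathbb{R},$$

so any of the real one-dimensional techniques above applies to each such linear combination.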
Thank you!
I think the three methods you listed pretty much cover it.
However, I'd like to stress that Stein's method is, to use your terms, a real workhorse in probability theory, especially in stochastic geometry. There are two advantages to Stein's method: it yields quantitative bounds (explicit rates of convergence in, say, the Wasserstein or Kolmogorov distance, rather than mere convergence), and it can handle dependent random variables, where characteristic-function arguments become unwieldy.
I still remain rather unhappy with all these methods of proving convergence to the normal law: they all seem to tackle the problem indirectly and, in my opinion, do not help one understand what it is that makes the normal law appear wherever a sum of (more or less) independent variables occurs.
I don't have any satisfying answer for your question regarding the complex Gaussian.
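As a side remark, the moment method from your list is easy to sanity-check numerically (this is an illustration, of course, not a proof). Here is a small sketch using NumPy; the choice of uniform summands, sample sizes, and tolerances are my own:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 200, 50_000

# S_n = standardized sum of n i.i.d. Uniform(-1, 1) variables.
# Uniform(-1, 1) has mean 0 and variance 1/3, so Var(sum) = n/3.
u = rng.uniform(-1.0, 1.0, size=(trials, n))
s = u.sum(axis=1) / np.sqrt(n / 3.0)

# First four moments of the standard Gaussian: 0, 1, 0, 3.
gaussian_moments = {1: 0.0, 2: 1.0, 3: 0.0, 4: 3.0}
empirical = {k: (s ** k).mean() for k in gaussian_moments}

for k, g in gaussian_moments.items():
    print(f"k={k}: empirical {empirical[k]:+.3f}, Gaussian {g:+.3f}")
```

The empirical moments land close to $0, 1, 0, 3$ already for $n = 200$, consistent with the method-of-moments criterion.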