How do they simplify this sum of random variables?


In my textbook we are dealing with the following sum of random variables:

$X_{n}=\frac{1}{n}\sum_{i=1}^{n}e^{-\gamma _{i}^{2}}$

for a given positive integer $n$, where the $\gamma_{i}$ are iid samples of $Z \sim N(0, 1)$. They then assert the following equality:

$\lim_{n\rightarrow \infty }X_{n}=\mathbb{E}[e^{-Z^{2}}]$

Why is the above true?

Best answer:

The idea is this: if you take a finite sample of size $n$ of a random variable, then the sample average approaches the expectation of the random variable as $n$ grows.

The mathematically precise statement is this:

Let $Y$ be a random variable with finite expectation and finite variance, and let $y_1, y_2, \dots$ be independent samples from its distribution. Then \begin{align} \lim_{n\to\infty} \frac{1}{n}\sum_{i=1}^n y_i = \mathbb{E}(Y) \end{align} almost surely. (Here, "almost surely" means "with probability one".) This is the strong law of large numbers.
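As a sanity check, here is a quick Monte Carlo sketch (assuming NumPy is available). It draws a large iid sample of $Z \sim N(0,1)$ and compares the sample average of $e^{-Z^2}$ with the exact expectation, which works out to $\mathbb{E}[e^{-Z^2}] = 1/\sqrt{3}$ via the Gaussian integral:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1_000_000
gamma = rng.standard_normal(n)     # iid samples gamma_i of Z ~ N(0, 1)
x_n = np.exp(-gamma**2).mean()     # X_n = (1/n) * sum_i e^{-gamma_i^2}

exact = 1 / np.sqrt(3)             # E[e^{-Z^2}] = 1/sqrt(3) ≈ 0.5774
print(f"X_n   = {x_n:.4f}")
print(f"exact = {exact:.4f}")
```

With $n = 10^6$ the two numbers agree to a few decimal places, as the strong law predicts.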

So, there are actually two technicalities omitted in your question (maybe your textbook mentions them, maybe not):

  • The convergence is only "almost sure". That means the equation can fail on some outcomes, but that event has probability zero, so we don't care.
  • The equation only holds if the variable $Y$ (which in this case is $Y=e^{-Z^2}$) has finite variance. Here this is immediate, since $0 < e^{-Z^2} \le 1$ means $Y$ is bounded; it can also be verified by computing the variance explicitly.
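For completeness, the explicit variance calculation uses the standard Gaussian integral $\mathbb{E}[e^{-tZ^2}] = (1+2t)^{-1/2}$ for $t > -\tfrac{1}{2}$:

\begin{align}
\mathbb{E}[e^{-Z^2}] &= \int_{-\infty}^{\infty} e^{-z^2}\,\frac{e^{-z^2/2}}{\sqrt{2\pi}}\,dz = \frac{1}{\sqrt{3}},\\
\mathbb{E}[e^{-2Z^2}] &= \frac{1}{\sqrt{5}},\qquad
\operatorname{Var}\!\left(e^{-Z^2}\right) = \frac{1}{\sqrt{5}} - \frac{1}{3} \approx 0.114 < \infty.
\end{align}

So both the expectation and the variance are finite, and the strong law applies.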