Multiplication of Gaussian pdfs


I have a sample mean given by:

$$S_n=\frac{1}{n}\sum_{i=1}^nX_i,$$ where the $X_i$ are i.i.d. Gaussian random variables, i.e., each of them has pdf:

$$p(X_i=x_i)=\frac{1}{\sqrt{2\pi\sigma^2}} e^{-(x_i-\mu)^2/2\sigma^2}$$

The parameters $\mu, \sigma^2$ are, of course, mean and variance.

The pdf of $S_n$ is then given by:

$$p(S_n=s)=\sqrt{\frac{n}{2\pi\sigma^2}} e^{-n(s-\mu)^2/2\sigma^2}$$

How do I get this result? It is obvious that the sum of Gaussian pdfs is also Gaussian, but how does the $n$ get into the square root and into the power of $e$?


BEST ANSWER

You can use the following Theorem:

If $X_1, X_2, \dots, X_n$ are mutually independent normal random variables with means $\mu_1, \mu_2, \dots, \mu_n$ and variances $\sigma_1^2, \sigma_2^2, \dots, \sigma_n^2$, then the linear combination:

$$Y=\sum\limits_{i=1}^n c_iX_i$$

follows the normal distribution:

$$ N\left(\sum\limits_{i=1}^n c_i \mu_i,\sum\limits_{i=1}^n c^2_i \sigma^2_i\right) $$

A proof of this theorem uses moment generating functions.
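A quick numerical sanity check of the theorem (a sketch using NumPy; the coefficients $c_i$ and parameters $\mu_i, \sigma_i$ below are arbitrary example values, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary example parameters (hypothetical, for illustration only)
c = np.array([0.5, -1.0, 2.0])      # coefficients c_i
mu = np.array([1.0, 2.0, -0.5])     # means mu_i
sigma = np.array([1.0, 0.5, 2.0])   # standard deviations sigma_i

trials = 200_000
# Each row draws independent normals X_1, X_2, X_3; Y = sum_i c_i X_i
X = rng.normal(mu, sigma, size=(trials, 3))
Y = X @ c

# Theorem: Y ~ N(sum_i c_i mu_i, sum_i c_i^2 sigma_i^2)
print(Y.mean(), (c * mu).sum())          # empirical vs. theoretical mean
print(Y.var(), (c**2 * sigma**2).sum())  # empirical vs. theoretical variance
```

The empirical mean and variance of `Y` should match the theorem's prediction up to Monte Carlo error.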

For your problem this means $c_i = \frac{1}{n}$, $\mu_i = \mu$, and $\sigma_i^2 = \sigma^2$, so:

$\sum\limits_{i=1}^n c_i \mu_i = \sum\limits_{i=1}^n \frac{1}{n} \mu = n \cdot \frac{\mu}{n} = \mu$

and

$\sum\limits_{i=1}^n c^2_i \sigma^2_i = \sum\limits_{i=1}^n \frac{1}{n^2} \sigma^2 = n \cdot \frac{\sigma^2}{n^2} = \frac{\sigma^2}{n}$

The resulting normal distribution is then

$$ S_n \sim N\left( \mu, \frac{\sigma^2}{n} \right) $$

This means the distribution of $S_n$ looks like the distribution of each $X_i$, except that $\sigma^2$ is replaced by $\sigma^2/n$. Substituting $\sigma^2 \to \sigma^2/n$ into the pdf of $X_i$ produces exactly the stated density, with $n$ under the square root and in the exponent.
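This can also be checked by simulation (a sketch using NumPy; the values of $\mu$, $\sigma$, and $n$ are assumed example choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed example parameters (not from the answer)
mu, sigma, n = 3.0, 2.0, 25
trials = 100_000

# Each row: one sample of n i.i.d. N(mu, sigma^2) variables
samples = rng.normal(mu, sigma, size=(trials, n))
S_n = samples.mean(axis=1)   # sample mean S_n, one per trial

# S_n should be distributed as N(mu, sigma^2 / n)
print(S_n.mean())  # close to mu = 3.0
print(S_n.var())   # close to sigma^2 / n = 4 / 25 = 0.16
```

The empirical variance of the sample means shrinks by the factor $1/n$ relative to $\sigma^2$, as the theorem predicts.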

Second answer

Note: the sum of the Gaussian pdfs of $n$ random variables is not itself a Gaussian pdf; rather, the sum of those random variables is distributed according to a Gaussian pdf. I will give a quick sketch of how to arrive at the result; you can fill in the gaps yourself.

The characteristic function of $S_n$ (the Fourier transform of its pdf) is defined as
$$\mathcal{Z}(\lambda)=\int_{-\infty}^{+\infty} ds\, e^{i\lambda s}\, P[S_{n}=s],$$
from which you can recover $P[S_{n}=s]$ as an inverse Fourier transform:
$$P[S_{n}=s]=\frac{1}{2\pi}\int_{-\infty}^{+\infty} d\lambda\, e^{-i\lambda s}\, \mathcal{Z}(\lambda).$$

Now $P[S_{n}=s]$ can be written as
$$P[S_{n}=s]=\Big\langle \delta\Big[s-\frac{1}{n}\sum_{k=1}^{n}s_{k}\Big] \Big\rangle\Big|_{\{s_{k}\}},$$
where $\langle\,\cdot\,\rangle|_{\{s_{k}\}}$ stands for the average over the joint distribution of all the i.i.d. random variables. Then, using independence,
$$\mathcal{Z}(\lambda)=\Big\langle e^{i \frac{\lambda}{n} \sum_{k=1}^{n}s_{k}} \Big\rangle\Big|_{\{s_{k}\}}=\prod_{k=1}^{n}\Big\langle e^{i \frac{\lambda}{n} s_{k}} \Big\rangle\Big|_{s_{k}}=\Big[\Big\langle e^{i \frac{\lambda}{n} s} \Big\rangle\Big|_{s}\Big]^{n}.$$
Here
$$\Big\langle e^{i \frac{\lambda}{n} s} \Big\rangle\Big|_{s}=\int_{-\infty}^{+\infty} ds\, e^{i \frac{\lambda}{n} s}\, p(X_{i}=s).$$

Once you find $\mathcal{Z}(\lambda)$, you can inverse Fourier transform it to get $P[S_{n}^{}=s]$.
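To make the sketch concrete, here is one way to fill the gaps, using the Gaussian pdf of $X_i$ from the question and the convention $\mathcal{Z}(\lambda)=\langle e^{i\lambda S_n}\rangle$; both steps are standard Gaussian integrals (complete the square in the exponent):

```latex
% Single-variable average, with the Gaussian pdf from the question:
\left\langle e^{i\frac{\lambda}{n} s} \right\rangle\Big|_{s}
  = \int_{-\infty}^{+\infty} ds\,
    e^{i\frac{\lambda}{n} s}\,
    \frac{e^{-(s-\mu)^2/2\sigma^2}}{\sqrt{2\pi\sigma^2}}
  = e^{\,i\lambda\mu/n - \lambda^2\sigma^2/2n^2}

% Raising to the n-th power:
\mathcal{Z}(\lambda) = e^{\,i\lambda\mu - \lambda^2\sigma^2/2n}

% Inverse Fourier transform (another Gaussian integral):
P[S_n = s]
  = \frac{1}{2\pi}\int_{-\infty}^{+\infty} d\lambda\,
    e^{-i\lambda(s-\mu) - \lambda^2\sigma^2/2n}
  = \sqrt{\frac{n}{2\pi\sigma^2}}\; e^{-n(s-\mu)^2/2\sigma^2}
```

This reproduces the pdf stated in the question, with $n$ appearing under the square root and in the exponent.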