Variance of product of Gaussian random variables


Suppose I have $r = [r_1, r_2, \dots, r_n]$, whose entries are i.i.d. and follow the normal distribution $N(\mu, \sigma^2)$, and a weight vector $h = [h_1, h_2, \dots, h_n]$, whose entries are i.i.d. $N(0, \sigma_h^2)$. How can I calculate $Var\left(\sum_{i=1}^n h_i r_i\right)$? Suppose $h$ and $r$ are independent.

How should I deal with the product of two random variables, and what is the formula to expand it? I am a bit confused.



On BEST ANSWER

First just consider the individual components, which are Gaussian r.v.s; call them $r,h$: $$r\sim N(\mu,\sigma^2),\qquad h\sim N(0,\sigma_h^2)$$ $$ Var(rh)=\mathbb E(r^2h^2)-\mathbb E(rh)^2=\mathbb E(r^2)\mathbb E(h^2)-(\mathbb E r \,\mathbb E h)^2 =\mathbb E(r^2)\mathbb E(h^2), $$ where the last equality holds because $\mathbb E h=0$. Under the given conditions, $\mathbb E(h^2)=Var(h)=\sigma_h^2$.

Writing $r=\sigma z+\mu$ with $z\sim N(0,1)$ a standard Gaussian random variable, $$ \mathbb E(r^2)=\mathbb E[\sigma^2(z+\tfrac \mu\sigma)^2]\\ = \sigma^2\mathbb E(z+\tfrac \mu\sigma)^2\\ =\sigma^2\mathbb E[z^2+2\tfrac \mu\sigma z+\tfrac {\mu^2}{\sigma^2}]\\ =\sigma^2+\mu^2, $$ since $\mathbb E(z^2)=1$ and $\mathbb E(z)=0$. Note that the noncentral chi-squared distribution is the distribution of the sum of squares of $k$ independent normally distributed random variables with means $\mu_i$ and unit variances; $r^2/\sigma^2$ is such an r.v. (with $k=1$).
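As a quick sanity check of $\mathbb E(r^2)=\sigma^2+\mu^2$, one can compare against a Monte Carlo estimate (a sketch; the values of $\mu$ and $\sigma$ below are arbitrary choices, not from the question):

```python
import numpy as np

# Arbitrary parameters for the check
mu, sigma = 1.5, 2.0

rng = np.random.default_rng(0)
r = rng.normal(mu, sigma, size=1_000_000)  # samples of r ~ N(mu, sigma^2)

empirical = np.mean(r**2)        # Monte Carlo estimate of E[r^2]
theoretical = sigma**2 + mu**2   # closed form derived above
```

With a million samples the two agree to a few decimal places.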

Putting it all together, and using the fact that the products $r_ih_i$ are i.i.d., $$ Var(r^Th)=nVar(r_ih_i)=n \mathbb E(r_i^2)\mathbb E(h_i^2) = n(\sigma^2 +\mu^2)\sigma_h^2 $$
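The final formula can also be checked end to end by simulating $h^Tr$ directly (a sketch; $n$ and the distribution parameters below are arbitrary choices):

```python
import numpy as np

# Arbitrary parameters for the check
n, mu, sigma, sigma_h = 5, 1.0, 1.5, 0.8
trials = 200_000

rng = np.random.default_rng(1)
r = rng.normal(mu, sigma, size=(trials, n))     # rows: draws of r
h = rng.normal(0.0, sigma_h, size=(trials, n))  # rows: draws of h
samples = np.sum(h * r, axis=1)                 # one draw of h^T r per row

empirical = samples.var()
theoretical = n * (sigma**2 + mu**2) * sigma_h**2
```

The empirical variance lands within sampling error of $n(\sigma^2+\mu^2)\sigma_h^2$.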

If we are not too sure of the result, take the special case $n=1,\mu=0,\sigma=\sigma_h$. Then we know $$ Var(rh)=\mathbb E(r^2h^2)=\mathbb E(r^2)\mathbb E(h^2) =Var(r)Var(h)=\sigma^4, $$ which agrees with the result we obtained above.


I have largely rewritten the answer. The original answer was based on this post:

Is the product of two Gaussian random variables also a Gaussian?

I found that the previous answer was wrong when $\sigma\neq \sigma_h$, since there is a dependency between the rotated variables, which makes the computation even harder. The answer above is simpler and correct.

ANOTHER ANSWER

The first thing to say is that if we define new random variables $X_i=h_ir_i$, then $X_i$ and $X_j$ are independent whenever $i\neq j$.

Therefore, we are able to say

$$Var \Big(\sum_{i=1}^nX_i \Big)=\sum_{i=1}^nVar(X_i)$$

Now, since the variance of each $X_i$ will be the same (as they are iid), we are able to say

$$\sum_{i=1}^nVar(X_i)=nVar(X_1)$$

So now let's pay attention to $X_1$. We know that $h$ and $r$ are independent which allows us to conclude that

$$Var(X_1)=Var(h_1r_1)=E(h^2_1r^2_1)-E(h_1r_1)^2=E(h^2_1)E(r^2_1)-E(h_1)^2E(r_1)^2$$

(by Fubini's Theorem).

We know that $E(h_1)=0$, so we can immediately eliminate the second term to give us

$$Var(h_1r_1)=E(h^2_1)E(r^2_1)$$

And so substituting this back into our desired value gives us

$$\sum_{i=1}^nVar(X_i)=nE(h^2_1)E(r^2_1) $$

Using the fact that $Var(A)=E(A^2)-E(A)^2$ (and that the expected value of $h_i$ is $0$), we note that for $h_1$ it follows that

$$Var(h_1)=E(h^2_1)=\sigma^2_h$$

And using the same formula for $r_1$, we observe that

$$Var(r_1)=E(r^2_1)-\mu^2=\sigma^2$$

Rearranging and substituting into our desired expression, we find that

$$\sum_{i=1}^nVar(X_i)=n\sigma^2_h (\sigma^2+\mu^2)$$
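This closed form can also be cross-checked against the general identity for independent random variables, $Var(XY)=Var(X)Var(Y)+Var(X)\,E(Y)^2+Var(Y)\,E(X)^2$, with $X=h_1$ (mean $0$, variance $\sigma_h^2$) and $Y=r_1$ (mean $\mu$, variance $\sigma^2$), where the $E(X)^2$ term vanishes (a sketch; the numeric values below are arbitrary):

```python
# Arbitrary parameters for the check
mu, sigma, sigma_h, n = 2.0, 3.0, 0.5, 4

# General product-variance identity for independent X, Y,
# with E[X] = 0 for X = h_1 and E[Y] = mu for Y = r_1:
var_product = (sigma_h**2 * sigma**2      # Var(X)Var(Y)
               + sigma_h**2 * mu**2       # Var(X) E[Y]^2
               + sigma**2 * 0.0**2)       # Var(Y) E[X]^2 = 0
total = n * var_product

closed_form = n * sigma_h**2 * (sigma**2 + mu**2)
```

Both routes give the same number, as expected, since $\sigma_h^2\sigma^2+\sigma_h^2\mu^2=\sigma_h^2(\sigma^2+\mu^2)$.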

Note: the other answer provides a broader approach; however, by the independence of the $r_i$ from each other, of the $h_i$ from each other, and of each $r_i$ from each $h_j$, the problem simplifies down quite a lot.