addition of two normally distributed random variables


I'm confused about why adding two independent, normally distributed random variables $x_1, x_2 \sim\mathcal{N}(\mu,\sigma)$ yields a new variable $y = x_1 + x_2$ with $y\sim\mathcal{N}(2\mu,\sqrt{2}\sigma)$, whereas multiplying a single random variable by $2$, i.e., $y=2x$ with $x\sim\mathcal{N}(\mu,\sigma)$, yields $y\sim\mathcal{N}(2\mu,2\sigma)$.

Is that correct?


This is absolutely correct. One might object that the double of a r.v. is just that variable added to itself, so the two cases look the same. But in the sum $x_1 + x_2$ the two terms are independent, while in $x + x$ they are perfectly dependent. The variance formula $\operatorname{Var}(x_1+x_2) = \operatorname{Var}(x_1) + \operatorname{Var}(x_2) + 2\operatorname{Cov}(x_1,x_2)$ shows the discrepancy: the covariance term is $0$ in the independent case, giving variance $2\sigma^2$ (std $\sqrt{2}\sigma$), but equals $\sigma^2$ in the perfectly dependent case, giving variance $4\sigma^2$ (std $2\sigma$).
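A quick simulation makes the difference concrete. This sketch (using NumPy; the values of `mu` and `sigma` are arbitrary choices for illustration) compares the empirical standard deviation of a sum of two independent draws against that of a single doubled draw:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
mu, sigma, n = 3.0, 2.0, 1_000_000

# Two independent draws from N(mu, sigma): their sum should have
# mean 2*mu and standard deviation sqrt(2)*sigma.
x1 = rng.normal(mu, sigma, n)
x2 = rng.normal(mu, sigma, n)
indep_sum = x1 + x2

# Doubling a single draw (x + x, perfectly dependent): mean 2*mu,
# but standard deviation 2*sigma.
doubled = 2 * x1

print(f"sum of independent draws: std ≈ {indep_sum.std():.3f}")  # ≈ sqrt(2)*sigma ≈ 2.828
print(f"single draw doubled:      std ≈ {doubled.std():.3f}")     # ≈ 2*sigma = 4.0
```

Both variables have mean $2\mu$; only the spread differs, because the covariance term vanishes in the independent case.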