How does standard deviation change when adding white noise?


Suppose I have $N$ samples from a multivariate Gaussian distribution $c\,\mathcal{N}(\mu=0,\Sigma=I)$, where $I$ is the identity matrix and $c$ is a real scalar. Now suppose I add a small amount of white noise sampled from another multivariate Gaussian distribution $d\,\mathcal{N}_{\text{noise}}(\mu=0,\Sigma=I)$, where $d$ is a different real scalar.

What is the rule for the growth of the covariances / the (isotropic) standard deviation?

I have made a few numerical experiments, and these are my results:

  • $c=1,d=0.1$: increase in $\sigma$ is on average $y=0.005$
  • $c=1,d=0.01$: increase in $\sigma$ is on average $y=0.00005$
  • $c=2,d=0.1$: increase in $\sigma$ is on average $y=0.0025$
  • $c=2,d=0.01$: increase in $\sigma$ is on average $y=0.000025$
  • $c=2,d=0.02$: increase in $\sigma$ is on average $y=0.0001$
  • $c=0.2,d=0.02$: increase in $\sigma$ is on average $y=0.001$

From the results I obtained, it seems like the equation is $$y=\frac{d^2}{2c}$$

Is this correct? If so, where does this result come from? Do you know of a formal way to derive this relation?
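The experiment above can be reproduced with a quick Monte-Carlo sketch (the sample count, dimensionality, and seed below are arbitrary choices, not from the original experiments):

```python
import numpy as np

# Hypothetical reproduction of one experiment: N samples in k dimensions.
rng = np.random.default_rng(0)
N, k = 1_000_000, 3
c, d = 1.0, 0.1

x = c * rng.standard_normal((N, k))  # samples from c * N(0, I)
z = d * rng.standard_normal((N, k))  # white noise from d * N(0, I)

# Isotropic standard deviation before and after adding the noise.
sigma_before = x.std()
sigma_after = (x + z).std()
increase = sigma_after - sigma_before

print(increase)  # empirically close to d**2 / (2 * c) = 0.005
```

For small $d/c$ the printed increase matches the conjectured $d^2/(2c)$ to within sampling error.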

Best answer

If $X \sim N(0, c^2 I)$ and $Z \sim N(0, d^2 I)$ are independent, then $X+Z \sim N(0, (c^2 + d^2) I)$. The increase in the standard deviation is $$\sqrt{c^2 + d^2} - \sqrt{c^2} = c\left(\frac{\sqrt{c^2 + d^2}}{\sqrt{c^2}} - 1\right) = c\left(\sqrt{1 + \frac{d^2}{c^2}} - 1\right).$$

If you use the approximation $\sqrt{1+u}- 1 \approx \frac{1}{2} u$ for small $u$, the above expression is approximately $\frac{d^2}{2c}$, as you obtained. (But this approximation may be bad when $u=d^2/c^2$ isn't small.)
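The exact expression and the small-$u$ approximation can be compared directly on the $(c, d)$ pairs from the question (a sketch; the helper names are mine):

```python
import math

def exact_increase(c, d):
    # Exact growth of the isotropic standard deviation: sqrt(c^2 + d^2) - c.
    return math.sqrt(c**2 + d**2) - c

def approx_increase(c, d):
    # First-order approximation sqrt(1 + u) - 1 ~ u/2 with u = d^2 / c^2,
    # which gives c * (d^2 / (2 c^2)) = d^2 / (2 c).
    return d**2 / (2 * c)

for c, d in [(1, 0.1), (1, 0.01), (2, 0.1), (2, 0.01), (2, 0.02), (0.2, 0.02)]:
    print(c, d, exact_increase(c, d), approx_increase(c, d))
```

For all of these pairs $u = d^2/c^2$ is at most $0.01$, so the two columns agree to several significant digits; the approximation degrades as $d$ approaches $c$.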