Wrong proof for the variance of a sum of normally-distributed variables?


I'm reading the book "Introduction to Error Analysis" by John R. Taylor.

The author is discussing the probability distribution of a sum of two normally-distributed random variables, and wants to show that if $x \sim N(0,\sigma_x^2)$ and $y\sim N(0,\sigma_y^2)$, then $x+y \sim N(0, \sigma_x^2 + \sigma_y^2)$.

He proceeds to prove this, but either there's something I'm seriously missing, or the proof is extremely hand-wavey (or just plain wrong). In particular, I don't see how you can take a term involving $x$ and $y$, call it $z$, and then conveniently integrate w.r.t. $z$ as if it were independent of $x$ and $y$.

I'm quoting the proof below, with some slight formatting modifications to make typing it out easier.

Thank you for your help.

$\Pr(x,y) \propto \exp\left[-\frac{1}{2}\left(\frac{x^2}{\sigma_x^2} + \frac{y^2}{\sigma_y^2}\right)\right]\quad\quad\quad$ (5.53)

Knowing the probability of obtaining any $x$ and $y$, we can now calculate the probability for any given value of $x+y$. The first step is to rewrite the exponent in (5.53) in terms of the variable of interest, $x+y$. This step can be done using the identity (which you can easily verify)

$ \frac{x^2}{A} + \frac{y^2}{B} = \frac{(x+y)^2}{A+B} + \frac{(Bx-Ay)^2}{AB(A+B)} \quad\quad\quad$ (5.54)

$= \frac{(x+y)^2}{A+B} + z^2 \quad\quad\quad (5.55) $

In the second line I have introduced the abbreviation $z^2$ for the second term on the right of (5.54) because its value does not interest us anyway.
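The identity (5.54) can be checked mechanically. As a quick sanity check (not part of Taylor's text), here is a symbolic verification with sympy:

```python
import sympy as sp

x, y, A, B = sp.symbols('x y A B', positive=True)

lhs = x**2 / A + y**2 / B
rhs = (x + y)**2 / (A + B) + (B*x - A*y)**2 / (A * B * (A + B))

# The difference simplifies to zero, confirming identity (5.54).
assert sp.simplify(lhs - rhs) == 0
```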

If we substitute (5.55) into (5.53), replacing $A$ with $\sigma_x^2$ and $B$ with $\sigma_y^2$, we obtain:

$ \Pr(x,y) \propto \exp\left[-\frac{1}{2}\left(\frac{(x+y)^2}{(\sigma_x^2 + \sigma_y^2)}\right) - \frac{z^2}{2} \right]\quad\quad\quad$ (5.56)

This probability for obtaining given values of $x$ and $y$ can just as well be viewed as the probability of obtaining given values of $x+y$ and $z$. Thus, we can rewrite (5.56) as

$ \Pr(x+y,z) \propto \exp\left[-\frac{1}{2}\left(\frac{(x+y)^2}{(\sigma_x^2 + \sigma_y^2)}\right)\right] \exp\left[- \frac{z^2}{2}\right] \quad\quad\quad$ (5.57)

Finally, what we want is the probability of obtaining a given value of $x+y$ irrespective of the value of $z$. This probability is obtained by summing, or rather integrating, (5.57) over all possible values of $z$, that is,

$ \Pr(x+y) = \int \limits_{-\infty}^{\infty} \Pr(x+y,z)\,dz \quad\quad\quad (5.58) $

When we integrate (5.57) with respect to $z$, the factor $\exp(-z^2/2)$ integrates to $\sqrt{2\pi}$, and we find

$ \Pr(x+y) \propto \exp\left[-\frac{1}{2}\left(\frac{(x+y)^2}{(\sigma_x^2 + \sigma_y^2)}\right)\right] \quad\quad\quad $ (5.59)
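The claimed result, $x+y \sim N(0, \sigma_x^2 + \sigma_y^2)$, is also easy to confirm numerically. A minimal Monte Carlo sketch (with arbitrary example values for $\sigma_x$ and $\sigma_y$):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_x, sigma_y = 1.5, 2.0   # arbitrary example values
n = 1_000_000

x = rng.normal(0.0, sigma_x, n)
y = rng.normal(0.0, sigma_y, n)
s = x + y

# Sample variance of x + y should approximate sigma_x^2 + sigma_y^2 = 6.25
print(s.var())
```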

On BEST ANSWER

Believe it or not, this is pretty much correct. The more systematic mathematical way of doing it is as a formal change of variables: $$ W = X+Y\\Z = \frac{BX-AY}{\sqrt{AB(A+B)}}.$$ The key thing Taylor glosses over here is that since the change of variables is linear, the Jacobian is just a constant and only has a trivial effect on the joint distribution $f_{W,Z}$. Then, once you've used the change-of-variables formula to get $f_{W,Z}$ (up to an unimportant normalization constant), just integrate over $Z$ to get the distribution for $W$ you're interested in. As Taylor says, this integral just contributes another trivial constant; $W$ and $Z$ wind up being independent.
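The claim that the Jacobian of this linear map is a constant (so it cannot affect the shape of the joint density, only its normalization) can be verified symbolically. A small sympy sketch:

```python
import sympy as sp

X, Y, A, B = sp.symbols('X Y A B', positive=True)

W = X + Y
Z = (B*X - A*Y) / sp.sqrt(A * B * (A + B))

# Jacobian matrix of the linear map (X, Y) -> (W, Z)
J = sp.Matrix([W, Z]).jacobian(sp.Matrix([X, Y]))
detJ = sp.simplify(J.det())

# The determinant depends only on A and B, not on X or Y,
# so it is a constant factor in the change-of-variables formula.
print(detJ)
```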