Proof that variance increases when subtracting random variables


Some students asked me why variances add even though the two random variables are subtracted. The example was $$y(n)=x(n)-x(n-1),$$ where the $x(n)$ are assumed to be i.i.d. We look for the expected value of $y$ and its variance.

I see why the variances have to add. It is quite intuitive, but I had a hard time explaining it to the students. Numerical examples worked, but is there a quick way to prove it? Some research on the internet only showed me that it is defined that way.
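For reference, here is a minimal sketch of the kind of numerical example mentioned above (the distribution chosen here, uniform on $[0,1)$ with variance $1/12$, is just an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# i.i.d. samples x(n); uniform on [0, 1) has variance 1/12
x = rng.random(1_000_000)

# first difference y(n) = x(n) - x(n-1)
y = x[1:] - x[:-1]

print(y.mean())      # close to 0, since E[y] = E[x(n)] - E[x(n-1)] = 0
print(y.var())       # close to 2/12: the variances add despite the subtraction
print(2 * x.var())
```

Each $y(n)$ is a difference of two independent samples, so its variance comes out near $2\cdot\mathrm{Var}(x)$, not $0$.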

BEST ANSWER

$$ \mathop{\mathrm{Var}}(x-y)= \mathbb E(x-y)^2 - \Big(\mathbb E(x-y)\Big)^2 = \mathbb Ex^2-2\,\mathbb E(xy)+\mathbb Ey^2-(\mathbb Ex)^2+2(\mathbb Ex)(\mathbb Ey)-(\mathbb Ey)^2 $$

Since $x$ and $y$ are independent, $\mathbb E(xy)=(\mathbb Ex)(\mathbb Ey)$, so the cross terms cancel. Finally: $$ \mathop{\mathrm{Var}}(x-y) = \mathbb Ex^2 -(\mathbb Ex)^2 + \mathbb Ey^2 -(\mathbb Ey)^2 = \mathop{\mathrm{Var}} x+ \mathop{\mathrm{Var}}y $$
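The identity above can be checked numerically. This is a sketch with arbitrarily chosen distributions (two independent normals with variances $9$ and $0.25$, an assumption for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)

# two independent samples with different, known variances
x = rng.normal(loc=2.0, scale=3.0, size=1_000_000)   # Var(x) = 9
y = rng.normal(loc=-1.0, scale=0.5, size=1_000_000)  # Var(y) = 0.25

d = x - y
print(d.var())             # close to 9.25 = Var(x) + Var(y)
print(x.var() + y.var())
```

The sample variance of $x-y$ matches $\mathrm{Var}(x)+\mathrm{Var}(y)$, not $\mathrm{Var}(x)-\mathrm{Var}(y)$.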