Show that the variance of the sum $x + y$ of the random variable $x$ and the random variable $y$ is the sum of the variance $\sigma_x^2$ of $x$ and the variance $\sigma_y^2$ of $y$.
My attempt:
This is my first time doing statistics and there are a lot of terms I am quite unfamiliar with.
Random variable: the value of a variable resulting from a random experiment, e.g. the score when a die is rolled once.
Variance ($\sigma^2$): the average of the squared deviations from the mean, or equivalently the square of the standard deviation.
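To make these definitions concrete, here is the die example from above worked out (my own numbers, purely for illustration):

```latex
E[x] = \frac{1+2+3+4+5+6}{6} = 3.5, \qquad
\sigma_x^2 = E[x^2] - (E[x])^2
           = \frac{1^2+2^2+\cdots+6^2}{6} - 3.5^2
           = \frac{91}{6} - \frac{49}{4}
           = \frac{35}{12} \approx 2.92.
```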
I know the variance of the sum $x + y$ means $\operatorname{var}(x+y) = E[(x+y)^2] - (E[x+y])^2$. But what does "the sum of the variance $\sigma_x^2$ of $x$ and the variance $\sigma_y^2$ of $y$" mean?
Edited: my 2nd attempt:
$$\operatorname{Var}(x+y) = E[(x+y)^2] - (E[x+y])^2 = E(x^2 + 2xy + y^2) - \big[(Ex)^2 + 2\,Ex\,Ey + (Ey)^2\big] \\ = \big[E x^2 - (Ex)^2\big] + \big[E y^2 - (Ey)^2\big] + 2\big[E(xy) - Ex\,Ey\big] = \operatorname{Var} x + \operatorname{Var} y + 2\operatorname{cov}(x,y)$$
Now I assume (not sure if this is correct) that the sum of the variance $\sigma_x^2$ of $x$ and the variance $\sigma_y^2$ of $y$ means
$$\operatorname{var}(\sigma_x^2) + \operatorname{var}(\sigma_y^2) = E(\sigma_x^2) - [E(\sigma_x)]^2 + E(\sigma_y^2) - [E(\sigma_y)]^2.$$
Since var(x) = $\sigma_x^2$,
$$= E\big(E(x^2) - [E(x)]^2\big) - \Big[E\big(E(x^2) - [E(x)]^2\big)\Big]^2 + E\big(E(y^2) - [E(y)]^2\big) - \Big[E\big(E(y^2) - [E(y)]^2\big)\Big]^2$$
I did not solve it further because I realised it is wrong: the expression above contains no $xy$ term at all. So where have I gone wrong?
The variance of $X$ is $E\left[\big(X - E[X]\big)^2\right]$ so here we are looking at
$$E\left[\big((X+Y) - E[X+Y]\big)^2\right] \\= E\left[\big((X-E[X])+(Y-E[Y])\big)^2\right] \\ = E\left[\big(X - E[X]\big)^2\right] +E\left[\big(Y - E[Y]\big)^2\right] +2E\left[\big(X - E[X]\big)\big(Y - E[Y]\big)\right] $$
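For reference, the cross term in the last line is, by definition, twice the covariance, and it matches the $E(xy) - Ex\,Ey$ term in your second attempt:

```latex
\operatorname{Cov}(X,Y) = E\big[(X-E[X])(Y-E[Y])\big] = E[XY] - E[X]\,E[Y].
```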
You actually want the result to be just the first two terms. The cross term vanishes if and only if the covariance is $0$, which happens when $X$ and $Y$ are independent, or in a few other special cases.
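As a quick numerical sanity check (a simulation sketch, not part of the proof; the sample size and seed are arbitrary choices of mine), you can simulate two independent die rolls and compare $\operatorname{Var}(X+Y)$ with $\operatorname{Var}(X)+\operatorname{Var}(Y)$:

```python
import random
import statistics

random.seed(0)
n = 200_000

# Two independent fair dice: covariance is (approximately) 0,
# so the variances should add.
xs = [random.randint(1, 6) for _ in range(n)]
ys = [random.randint(1, 6) for _ in range(n)]

var_x = statistics.pvariance(xs)
var_y = statistics.pvariance(ys)
var_sum = statistics.pvariance([x + y for x, y in zip(xs, ys)])

# Each die has variance 35/12 ≈ 2.917, so both quantities should be ≈ 5.833.
print(var_x + var_y, var_sum)
```

With dependent variables (e.g. $Y = X$) the two quantities would differ by $2\operatorname{Cov}(X,Y)$, which is exactly the cross term in the derivation above.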