Correlation of $X+Y$ and $X-Y$


Suppose that $X$ and $Y$ are random variables with the same variance. Show that $X-Y$ and $X+Y$ are uncorrelated.

The solution is below; however, I am confused about a portion of the explanation.

Solution:
Because the covariance remains unchanged when we add a constant to a random variable, we can assume without loss of generality that $X$ and $Y$ have zero mean. We then have $$\operatorname{cov}(X-Y, X+Y ) = \operatorname{E}[(X-Y )(X + Y )] = \operatorname{E}[X^2] -E[Y^2] = \operatorname{var}(X) - \operatorname{var}(Y ) = 0$$

since $X$ and $Y$ were assumed to have the same variance.

My question:
I cannot see the relevance of the phrase "Because the covariance remains unchanged when we add a constant to a random variable" to this question. An explanation would be appreciated.

There are 4 answers below.

Accepted answer:

$\newcommand{\cov}{\operatorname{cov}}\newcommand{\var}{\operatorname{var}}$Sometimes I'm amazed at how complicated people make their posted answers here. \begin{align} & \cov(X-Y,X+Y) \\[12pt] = {} & \cov(X,X+Y) - \cov(Y,X+Y) & & \text{because cov is linear} \\[-6pt] & & & \text{in the first argument} \\[12pt] = {} & \Big(\cov(X,X) + \cov(X,Y) \Big) - \Big( \cov(Y,X) + \cov(Y,Y)\Big) & & \text{because cov is linear} \\[-9pt] & & & \text{in the second argument} \\[12pt] = {} & \var(X) + \cov(X,Y) - \cov(Y,X) - \var(Y) = \cdots \end{align}
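The bilinearity used in this expansion can be sanity-checked numerically. The sketch below is my own (the helpers `mean` and `cov` are not from the answer); it uses population-normalized sample moments, for which the same algebraic identity holds exactly, so the expanded form matches the direct computation up to rounding:

```python
import random

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    # population-style sample covariance: E[AB] - E[A]E[B]
    return mean([u * w for u, w in zip(a, b)]) - mean(a) * mean(b)

random.seed(0)
x = [random.gauss(2.0, 1.5) for _ in range(10_000)]
y = [random.gauss(-1.0, 1.5) for _ in range(10_000)]

# direct computation of cov(X - Y, X + Y)
lhs = cov([a - b for a, b in zip(x, y)],
          [a + b for a, b in zip(x, y)])

# term-by-term bilinear expansion: cov(X,X) + cov(X,Y) - cov(Y,X) - cov(Y,Y)
rhs = cov(x, x) + cov(x, y) - cov(y, x) - cov(y, y)

print(abs(lhs - rhs) < 1e-9)
```

The two sample variances are not exactly equal here, so neither side is exactly zero; the point is that the bilinear expansion agrees with the direct computation identically.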

Answer:

Note that $\operatorname{cov} (A,B) = E[(A-EA)(B-EB)]$.

If $c$ is a constant, then $Ec = c$; hence for any constants $a,b$ we have

\begin{eqnarray} \operatorname{cov} (A+a,B+b) &=& E[((A+a)-E(A+a))((B+b)-E(B+b))] \\ &=& E[(A-EA)(B-EB)] \\ &=& \operatorname{cov} (A,B) \end{eqnarray}

In the case of the question, we would choose $a=-EA, b=-EB$, hence we can just assume that $A,B$ have zero mean to start with.
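This shift invariance is easy to check numerically. The snippet below is my own sketch (the names `A`, `B`, and the particular shift constants are made up): adding constants to both samples leaves the sample covariance unchanged up to rounding.

```python
import random

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    # population-style sample covariance: E[AB] - E[A]E[B]
    return mean([u * w for u, w in zip(a, b)]) - mean(a) * mean(b)

random.seed(1)
A = [random.gauss(5.0, 2.0) for _ in range(5_000)]
B = [random.gauss(-3.0, 2.0) for _ in range(5_000)]

# shift both samples by arbitrary constants and compare covariances
shifted = cov([u + 17.0 for u in A], [w - 4.2 for w in B])

print(abs(shifted - cov(A, B)) < 1e-9)
```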

Answer:
  • The statement tells us that $\text{cov}(X+a,Y+b)=\text{cov}(X,Y)$ for any constants $a,b$. This means that if we take $a=-\mathbb{E}[X]$ and $b=-\mathbb{E}[Y]$, then $$\text{cov}(X-Y,X+Y)=\text{cov}(X'-Y',X'+Y'),$$ where $X',Y'$ are the centred (mean-$0$) versions of $X,Y$.
  • A random variable $X$ has mean $0 \iff \text{var}X = \mathbb{E}[X^2]$, which is what allows the manipulation "$\mathbb{E}[X^2]-\mathbb{E}[Y^2] = \text{var}X - \text{var}Y$". This is why the assumption of $X,Y$ being $0$-mean is handy.
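The second bullet can be illustrated numerically. The sketch below is mine (helper names are not from the answer): for the centred version of a sample, the variance coincides with the mean of the squares.

```python
import random

def mean(v):
    return sum(v) / len(v)

random.seed(3)
x = [random.gauss(4.0, 2.0) for _ in range(5_000)]

# var(X) = E[X^2] - E[X]^2, computed on the raw sample
var_x = mean([u * u for u in x]) - mean(x) ** 2

# centred (mean-0) version of the sample
xc = [u - mean(x) for u in x]

# for the centred sample, E[X'^2] equals var(X)
mean_sq_centred = mean([u * u for u in xc])

print(abs(var_x - mean_sq_centred) < 1e-9)
```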
Answer:

Without making this assumption, the covariance is $$ \operatorname{cov}(X+Y,X-Y) = E((X+Y)(X-Y)) - E(X+Y)E(X-Y).$$ When you assume they have mean zero, $E(X+Y) = E(X)+E(Y) = 0$ (and similarly $E(X-Y)=0$), so the covariance becomes $E(X^2)-E(Y^2)$. And then, since $E(X)=E(Y)=0$, we have $E(X^2) = \operatorname{var}(X)$ and $E(Y^2) = \operatorname{var}(Y)$.

It's not too bad if you don't use this trick, though. You can just write $$ E((X+Y)(X-Y)) - E(X+Y)E(X-Y) = E(X^2)-E(Y^2) - \big(E(X)^2-E(Y)^2\big)\\= \big(E(X^2)-E(X)^2\big) - \big(E(Y^2)-E(Y)^2\big) = \operatorname{var}(X)-\operatorname{var}(Y).$$
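As a numerical sanity check of this computation (my own sketch, not part of the answer), one can rescale one sample so the two sample variances agree exactly, and watch the covariance of the sum and difference vanish:

```python
import random

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    # population-style sample covariance: E[AB] - E[A]E[B]
    return mean([u * w for u, w in zip(a, b)]) - mean(a) * mean(b)

def var(a):
    return cov(a, a)

random.seed(2)
x = [random.gauss(0.0, 1.0) for _ in range(5_000)]
y = [random.gauss(0.0, 3.0) for _ in range(5_000)]

# rescale y so its sample variance exactly matches that of x
s = (var(x) / var(y)) ** 0.5
y = [s * w for w in y]

# cov(X+Y, X-Y) = var(X) - var(Y), which is now zero up to rounding
c = cov([a + b for a, b in zip(x, y)],
        [a - b for a, b in zip(x, y)])

print(abs(c) < 1e-9)
```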