Proof that $\rho = \pm1$ implies linear combination


I need to prove that $$\rho(X,Y) = \pm 1 \implies Y = aX+b,$$ for some constants $a, b$.

With the help of this thread and this document, here's what I've got so far:

Let $X, Y$ be random variables and $a$ a constant. Then $aX + Y$ is also a random variable, and its variance is by definition non-negative: $$V(aX+Y) \ge 0.$$ Expanding via the properties of variance, $$a^2 V(X)+2a\operatorname{Cov}(X,Y)+V(Y)\ge 0.$$ Viewed as a quadratic in $a$, this is non-negative for every $a$, so it has at most one (double) root. A root exists only for an $a$ satisfying $$a^2 V(X)+2a\operatorname{Cov}(X,Y)+V(Y)=0;$$ call it $a_0$. That means $$V(a_0X+Y)=0.$$
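(For reference, the step implicit above: a quadratic in $a$ that is non-negative for every $a$ must have non-positive discriminant, so $$4\operatorname{Cov}(X,Y)^2-4\,V(X)V(Y)\le 0 \quad\Longleftrightarrow\quad \operatorname{Cov}(X,Y)^2\le V(X)V(Y),$$ which is the Cauchy-Schwarz inequality in covariance form; a root $a_0$ exists exactly when equality holds.)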

How can I continue from here?

Thank you.



Hint: In which cases do we have equality in the Cauchy-Schwarz inequality?

Further hint: Here, the Cauchy-Schwarz inequality is used in the form $$\operatorname{Cov}(X,Y)^2\le\operatorname{Var}(X) \operatorname{Var}(Y)$$
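To spell the hint out a little further (assuming $V(X), V(Y) > 0$): dividing the Cauchy-Schwarz inequality by $V(X)V(Y)$ gives $$\rho(X,Y)^2=\frac{\operatorname{Cov}(X,Y)^2}{V(X)\,V(Y)}\le 1,$$ so $\rho(X,Y)=\pm 1$ is exactly the equality case, and equality holds precisely when $aX+Y$ has zero variance for some $a$, i.e. when $aX+Y$ is almost surely constant.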


Here is a slightly different proof. Let $X':=X/\sigma_X$ and $Y':=Y/\sigma_Y$. Then \begin{align} &\operatorname{Var}(X'+Y')=2(1+\rho_{X,Y}) \quad\text{and} \\ &\operatorname{Var}(X'-Y')=2(1-\rho_{X,Y}). \end{align} If $\rho_{X,Y}=1$, then $\operatorname{Var}(X'-Y')=0 \Leftrightarrow X'-Y'=\mathsf{E}X'-\mathsf{E}Y'$ a.s., which implies that $$ Y=aX+\left(\mathsf{E}Y-a\mathsf{E}X\right) \quad\text{a.s.}, $$ where $a\equiv\sigma_Y/\sigma_X$. Similarly, when $\rho_{X,Y}=-1$, $$ Y=-aX+\left(\mathsf{E}Y+a\mathsf{E}X\right) \quad\text{a.s.} $$
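As a quick numerical sanity check of this answer (a plain-Python sketch, using the population versions of variance and covariance): for $Y = 2X + 3$ we should get $\rho_{X,Y} = 1$ and $\operatorname{Var}(X'-Y') = 0$.

```python
# Sanity check: if Y = aX + b with a > 0, then rho(X, Y) = 1
# and the standardized difference X' - Y' has zero variance.
import math

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    # Population covariance: E[(U - EU)(V - EV)]
    mu, mv = mean(u), mean(v)
    return sum((x - mu) * (y - mv) for x, y in zip(u, v)) / len(u)

X = [1.0, 2.0, 3.0, 5.0, 8.0]
Y = [2 * x + 3 for x in X]  # Y = 2X + 3

var_X, var_Y = cov(X, X), cov(Y, Y)
rho = cov(X, Y) / math.sqrt(var_X * var_Y)

Xp = [x / math.sqrt(var_X) for x in X]  # X' = X / sigma_X
Yp = [y / math.sqrt(var_Y) for y in Y]  # Y' = Y / sigma_Y
diff = [x - y for x, y in zip(Xp, Yp)]  # should be a.s. constant
var_diff = cov(diff, diff)

print(rho, var_diff)  # rho ~ 1, var_diff ~ 0 (up to float rounding)
```

Replacing `Y = 2 * x + 3` with a decreasing map such as `Y = -2 * x + 3` gives $\rho_{X,Y} = -1$ instead, matching the second case of the answer.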