What does orthogonal random variables mean?


As far as I know, orthogonality is a concept from linear algebra: in the 2D or 3D case, two vectors are orthogonal if they are perpendicular, and the idea extends to higher dimensions. But when it comes to random variables, I cannot figure out what orthogonality means. I have read that two random variables $X$ and $Y$ are called orthogonal if $E[XY] = 0$. How does that fit the geometric picture?

Is orthogonality the same concept in linear algebra as in probability and statistics?


BEST ANSWER

Orthogonality comes from the idea of a vanishing inner product. For square-integrable random variables the inner product is $\langle X, Y \rangle = \mathbb E[XY]$, where the expectation of a single variable is $$ \mathbb E \left[ X \right] = \int_{-\infty}^\infty x \, d\mu_X(x), $$ with $\mu_X$ the distribution of $X$. So orthogonal random variables are those with $$ \mathbb E \left[ XY \right] = \int_{\mathbb R^2} xy \, d\mu_{X,Y}(x,y) = 0, $$ where $\mu_{X,Y}$ is the joint distribution of $(X, Y)$. (The integral factors as $d\mu_X \, d\mu_Y$ only when $X$ and $Y$ are independent.)
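As a quick numerical sanity check (my own illustration, not part of the answer), one can estimate $\mathbb E[XY]$ by Monte Carlo. For a standard normal $X$, the pair $X$ and $Y = X^2$ is orthogonal, since $\mathbb E[X \cdot X^2] = \mathbb E[X^3] = 0$ by symmetry of the normal distribution, even though $X$ and $X^2$ are clearly dependent:

```python
import random
import statistics

random.seed(0)  # fixed seed so the estimate is reproducible
n = 200_000

# Draw samples of X ~ N(0, 1)
xs = [random.gauss(0.0, 1.0) for _ in range(n)]

# Estimate E[X * Y] with Y = X^2, i.e. E[X^3], which is 0 for a symmetric X
exy = statistics.fmean(x * x * x for x in xs)

print(exy)  # close to 0, so X and X^2 are (approximately) orthogonal
```

Note that $X$ and $X^2$ are orthogonal here without being independent, which is exactly why the double integral above must use the joint distribution in general.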

ANSWER

If $\langle X, Y \rangle$ = 0, then we say $X$ and $Y$ are orthogonal, where $X, Y$ are vectors in an inner product space with inner product $\langle \cdot, \cdot \rangle$.

Now, let $X, Y$ denote two random variables, and take $\langle X, Y \rangle = \operatorname{Cov}(X,Y),$ the covariance of $X$ and $Y.$ One can verify that this satisfies the inner-product axioms on the space of zero-mean random variables (identifying variables that are equal almost surely): it is symmetric, bilinear, and $\operatorname{Cov}(X,X) = \operatorname{Var}(X) \geq 0,$ with equality only when $X$ is almost surely constant.

But, we also know that $Cov(X,Y) = \mathbb{E} [XY] - \mathbb{E} [X]\mathbb{E} [Y],$ so we have that $$\langle X, Y \rangle = \mathbb{E} [XY] - \mathbb{E} [X]\mathbb{E} [Y].$$ If $X$ and $Y$ are independent, as the term is used in probability theory, then $\mathbb{E} [XY] = \mathbb{E} [X]\mathbb{E} [Y],$ so $$\langle X, Y \rangle = \mathbb{E} [XY] - \mathbb{E} [X]\mathbb{E} [Y] = \mathbb{E} [X]\mathbb{E} [Y] - \mathbb{E} [X]\mathbb{E} [Y] = 0.$$ Therefore, $X$ and $Y$ are orthogonal with respect to this inner product, as the term is used in linear algebra. (The converse is false: uncorrelated random variables need not be independent.)
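The identity $\operatorname{Cov}(X,Y) = \mathbb{E}[XY] - \mathbb{E}[X]\mathbb{E}[Y]$ and its vanishing under independence can be checked numerically. A minimal sketch (my own, not part of the answer), with an independent normal $X$ and uniform $Y$ whose means are deliberately nonzero:

```python
import random
import statistics

random.seed(1)  # fixed seed so the estimates are reproducible
n = 200_000

# Independent samples: X ~ N(2, 1) so E[X] = 2, Y ~ Uniform(0, 1) so E[Y] = 0.5
xs = [random.gauss(2.0, 1.0) for _ in range(n)]
ys = [random.random() for _ in range(n)]

exy = statistics.fmean(x * y for x, y in zip(xs, ys))  # estimate of E[XY]
ex = statistics.fmean(xs)                              # estimate of E[X]
ey = statistics.fmean(ys)                              # estimate of E[Y]

# Cov(X, Y) = E[XY] - E[X]E[Y]; near 0 because X and Y are independent
cov = exy - ex * ey
print(exy, ex * ey, cov)
```

Here $\mathbb{E}[XY] \approx \mathbb{E}[X]\mathbb{E}[Y] \approx 1$, so the covariance inner product vanishes even though $\mathbb{E}[XY] \neq 0$: orthogonality under the covariance inner product means uncorrelated, not $\mathbb{E}[XY] = 0$.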