Best linear prediction as a projection in a Hilbert space $L^2$


Consider two random variables $Y$ and $X$. In the context of best linear prediction, if we want to predict $Y$ given $X$, we obtain the solution by solving the following minimization problem

\begin{equation} \min_{a,b}\; E[(Y - (aX + b))^2] \end{equation}

Using the first-order conditions, we conclude that

$$a =\frac{cov(X,Y)}{V(X)},\quad b= E[Y] - a E[X]$$
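For completeness, the first-order conditions can be spelled out. Setting the derivatives with respect to $b$ and $a$ to zero gives

\begin{align*}
\frac{\partial}{\partial b}\, E[(Y - aX - b)^2] &= -2\, E[Y - aX - b] = 0 \;\Rightarrow\; b = E[Y] - a E[X],\\
\frac{\partial}{\partial a}\, E[(Y - aX - b)^2] &= -2\, E[X(Y - aX - b)] = 0,
\end{align*}

and substituting $b = E[Y] - a E[X]$ into the second condition yields $E[XY] - E[X]E[Y] - a\left(E[X^2] - E[X]^2\right) = 0$, i.e. $a = cov(X,Y)/V(X)$.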

I would like to get the same solution in the context of the $L^2$ space as a Hilbert space. We know that $<X,Y> := E[XY]$ is an inner product. If I define the space spanned by $\{1, X\}$ as $F$, I would like to get the same coefficients $a, b$ by projecting $Y$ onto $F$. In other words, if we know that

$$p_F(Y) = \frac{<Y,1>}{<1,1>}1 + \frac{<Y,X>}{<X,X>}X$$

We would have

$$a = \frac{<Y,X>}{<X,X>}, \quad b = \frac{<Y,1>}{<1,1>}$$

Or

$$a = \frac{E[YX]}{E[XX]}, \quad b = E[Y].$$

But I can reach my goal only if $E[X] = 0$. Any ideas?

Best Answer

After thinking about it for a long time, I was able to resolve it.

First, the projection given above is correct only if the set $\{1, X\}$ is orthogonal. So we have to use the Gram–Schmidt process to transform $\{1, X\}$ into an orthogonal set:

$X_1 = 1$

$X_2 = X - \frac{<X,1>}{<1,1>}1 = X - E[X]$
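One can check directly that the new set is orthogonal and that $X_1$ is normalized:

$$<X_2, X_1> = E[(X - E[X])\cdot 1] = E[X] - E[X] = 0, \qquad <X_1, X_1> = E[1] = 1.$$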

So,

\begin{equation} \begin{split} p_F(Y) &= \frac{<Y,X_1>}{<X_1,X_1>}X_1 + \frac{<Y,X_2>}{<X_2,X_2>}X_2 \\ &= E[Y] + \frac{E[Y (X - E[X])]}{E[(X - E[X])^2]} (X - E[X])\\ &= E[Y] + \frac{E[(Y - E[Y] + E[Y]) (X - E[X])]}{E[(X - E[X])^2]} (X - E[X])\\ &= E[Y] + \left[\frac{E[(Y - E[Y]) (X - E[X])]}{V(X)} - \frac{E[E[Y](X - E[X])]}{V(X)}\right] (X - E[X])\\ &= E[Y] + \frac{E[(Y - E[Y]) (X - E[X])]}{V(X)} (X - E[X])\\ &= E[Y] + \frac{Cov(X,Y)}{V(X)} (X - E[X])\\ &= \frac{Cov(X,Y)}{V(X)} X + E[Y] - \frac{Cov(X,Y)}{V(X)}E[X]\\ &= aX + b \end{split} \end{equation}
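As a quick numerical sanity check (a minimal sketch using NumPy; the simulated coefficients $a = 3$, $b = 1$, the distribution of $X$, and the seed are arbitrary choices, not part of the derivation), the Gram–Schmidt projection can be compared with the closed-form coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)  # arbitrary seed for reproducibility
n = 100_000
X = rng.normal(2.0, 1.5, n)               # E[X] != 0 on purpose
Y = 3.0 * X + 1.0 + rng.normal(0.0, 1.0, n)

# Closed-form best-linear-prediction coefficients: a = cov(X,Y)/V(X), b = E[Y] - a E[X]
a = np.cov(X, Y, ddof=0)[0, 1] / np.var(X)
b = Y.mean() - a * X.mean()

# Projection onto span{X1, X2} after Gram-Schmidt, with <U,V> = E[UV]
# estimated by the sample average (U @ V) / n, so the 1/n factors cancel.
X1 = np.ones(n)
X2 = X - X.mean()
proj = (Y @ X1) / (X1 @ X1) * X1 + (Y @ X2) / (X2 @ X2) * X2

print(np.allclose(proj, a * X + b))  # prints True
```

The two expressions agree up to floating-point error, confirming that the projection onto the orthogonalized set reproduces $aX + b$ even when $E[X] \neq 0$.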