Least square representation using eigenvector of $E[xx^T]$

The question is in the context of random design; $E$ denotes expectation.

Define $\beta$ such that $E[(\langle\beta, x\rangle - y)^2] = \min_w E[(\langle w, x\rangle - y)^2]$.

Then we can write $\beta = \sum_j \beta_j v_j$, where the $v_j$ are the eigenvectors of the matrix $\Sigma := E[xx^\top]$ and $\beta_j = \frac{E[\langle v_j, x\rangle y]}{E[\langle v_j, x\rangle^2]}$.

I tried solving $0 = \frac{d}{dw}E[(\langle w, x \rangle - y)^2]$ and found that the minimizer is $\beta = (E[xx^\top])^{-1}E[yx] = \Sigma^{-1}E[yx]$. But how do I relate this expression to the eigenvectors of $\Sigma$?


Let $V\Lambda V^\top$ be an eigendecomposition of $\Sigma$. The orthonormal columns $v_j$ of $V$ are the eigenvectors, and the diagonal elements $\lambda_j$ of $\Lambda$ are the corresponding eigenvalues. Note that $\lambda_j=v_j^\top\Sigma v_j=E[v_j^\top xx^\top v_j]$.

Now substitute $\Sigma^{-1}=V\Lambda^{-1}V^\top$ into $\beta=\Sigma^{-1}E[yx]$ and rearrange: since $V\Lambda^{-1}V^\top=\sum_j \lambda_j^{-1}v_jv_j^\top$, we get $\beta=\sum_j \frac{v_j^\top E[yx]}{\lambda_j}\,v_j=\sum_j \frac{E[\langle v_j,x\rangle y]}{E[\langle v_j,x\rangle^2]}\,v_j$, which is exactly the claimed form.
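As a sanity check, here is a small numerical sketch (with synthetic data; all names and the data-generating model are my own choices): treating a finite sample as the whole population, expectations become sample averages, and the direct minimizer $\Sigma^{-1}E[yx]$ should match the eigenvector expansion exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 10_000, 4
X = rng.normal(size=(n, d))                      # rows are draws of x
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(size=n)

Sigma = (X.T @ X) / n                            # E[x x^T] (sample average)
Exy = (X.T @ y) / n                              # E[y x]

# Direct least-squares minimizer: beta = Sigma^{-1} E[y x]
beta_direct = np.linalg.solve(Sigma, Exy)

# Eigenvector form: beta = sum_j (E[<v_j,x> y] / E[<v_j,x>^2]) v_j
lam, V = np.linalg.eigh(Sigma)                   # Sigma = V diag(lam) V^T
proj = X @ V                                     # column j holds <v_j, x> per sample
coef = (proj * y[:, None]).mean(axis=0) / (proj ** 2).mean(axis=0)
beta_eig = V @ coef

assert np.allclose(beta_direct, beta_eig)
```

Note that $(1/n)\sum_i \langle v_j, x_i\rangle^2 = v_j^\top \Sigma v_j = \lambda_j$ holds exactly here, so the two expressions agree up to floating-point error, not just approximately.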