Correlated multivariate linear least squares


Linear least squares solves $y = X\beta + \varepsilon$ for $\beta$, where $y$ is a vector of size $n$ and $\operatorname{Var}[\,\varepsilon \mid X\,] = \sigma^2 I_n$ (spherical errors). I have multivariate measurements, meaning that $y$ is a matrix; moreover, the variates within each measurement are correlated, and I have a known covariance matrix for each measurement (which also means each measurement has a different covariance). So, obviously, the spherical-errors assumption of ordinary least squares doesn't hold. Is there some extension of least squares that allows solving such a problem?

Clarification update: in my problem, every row of $y$, $y_i$, is a vector $\begin{pmatrix}y_{i1}&y_{i2}&y_{i3}\end{pmatrix}$ of random variables (treated as a column vector below), with a known, non-diagonal covariance. It is a 3-D location measurement at input time $X_i$ (every row of $X$ is $1$, the time, and the time squared).
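For concreteness, the design matrix described above can be built like this (a minimal sketch; the time values are made up):

```python
import numpy as np

# Each row of X is (1, t, t^2) for one measurement time t.
times = np.array([0.0, 0.5, 1.0, 1.5])
X = np.column_stack([np.ones_like(times), times, times ** 2])
print(X)
```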


I had time to come back to this problem today, and I think I solved it:

Instead of $y$ being a matrix as in the general linear model, I look at every measurement independently, so each $y_i$ is a column vector. Then, as in generalized least squares (GLS), the objective for the $i$-th measurement is $$(y_i-X_ib)^t\,P^{-1}_i(y_i-X_ib),$$ where $P_i$ is the covariance matrix of that measurement. But we want to minimize across all the measurements, so we sum these objectives (since we are minimizing a sum of squares), and the problem becomes $$\hat\beta = \underset{b}{\rm arg\,min}\,\sum_i(y_i-X_ib)^t\,P^{-1}_i(y_i-X_ib).$$

If we follow the proof of generalized least squares, which gives the optimal solution $$\hat\beta = (X^tP^{-1}X)^{-1} X^tP^{-1}y,$$ it's easy to see (it's just sums of matrix derivatives) that with the sum we get $$\hat\beta = \Big(\sum_iX_i^tP_i^{-1}X_i\Big)^{-1} \sum_iX_i^tP_i^{-1}y_i.$$
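The summed estimator above can be sketched numerically. This is a hypothetical example of my own making: I assume the same quadratic-in-time model for each of the three coordinates, so stacking the nine coefficients into one vector $b$ gives a per-measurement design matrix $X_i = I_3 \otimes (1, t_i, t_i^2)$ of shape $3\times 9$; the names (`design`, `beta_true`, etc.) are illustrative, not from the original post.

```python
import numpy as np

rng = np.random.default_rng(0)

def design(t):
    # X_i = kron(I_3, (1, t, t^2)): each coordinate gets its own quadratic.
    return np.kron(np.eye(3), np.array([1.0, t, t * t]))

beta_true = rng.normal(size=9)
times = np.linspace(0.0, 1.0, 50)

# One known, non-diagonal, positive-definite 3x3 covariance P_i per measurement.
Xs, ys, covs = [], [], []
for t in times:
    A = rng.normal(size=(3, 3))
    P = A @ A.T + 0.1 * np.eye(3)
    Xi = design(t)
    yi = Xi @ beta_true + rng.multivariate_normal(np.zeros(3), P)
    Xs.append(Xi); ys.append(yi); covs.append(P)

# Summed GLS estimator:
#   beta_hat = (sum_i X_i' P_i^{-1} X_i)^{-1} (sum_i X_i' P_i^{-1} y_i)
# Using solve(P, .) instead of forming P^{-1} explicitly.
N = sum(Xi.T @ np.linalg.solve(P, Xi) for Xi, P in zip(Xs, covs))
m = sum(Xi.T @ np.linalg.solve(P, yi) for Xi, P, yi in zip(Xs, covs, ys))
beta_hat = np.linalg.solve(N, m)
print(beta_hat - beta_true)
```

Using `np.linalg.solve` rather than explicit matrix inversion is the usual numerically safer choice here.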