Multivariate least squares for estimating the matrix


$M\vec{x}_t + \vec{e}_t = \vec{y}_t$

where $\vec{e}_t$ is the error vector at time $t$.

Given a sufficient number of pairs of vectors $\vec{x}_t$ and $\vec{y}_t$, can a least-squares method estimate the entries of the matrix $M$?

$\vec{x}$ contains only zeros and ones.

What is the name of the method and where can I find out how it is done?

Thanks.

Accepted answer:

Vectors are all column vectors.

The sum of squared errors for one pair $\vec{x}$, $\vec{y}$ is:

$\vec{e}^{T}\vec{e} = (\vec{y} - M\vec{x})^{T}(\vec{y} - M\vec{x})$

$= \vec{y} \cdot \vec{y} - 2(M\vec{x})\cdot \vec{y} + (M\vec{x}) \cdot (M\vec{x})$

The derivative of the error at time $t$ with respect to each entry $M_{jk}$ (where $K$ is the length of $\vec{x}$):

$\forall j, k: {\frac{\partial{\vec{e}^{T}\vec{e}}}{\partial M_{jk}}} = -2 x_{k}y_{j} + 2[\sum_{b=1}^{K}M_{jb}x_{b}]x_{k}$

Summing the derivative over all $t$ and setting it to zero gives the normal equations:

$MA = B$

so $M = BA^{-1}$, provided $A$ is invertible, where

$A = [\vec{a}_1 \;\dots\; \vec{a}_K]$

$\vec{a}_k = \sum_{t} \vec{x}_t x_{kt}$, i.e. $A = \sum_t \vec{x}_t \vec{x}_t^{T}$

$B_{jk} = \sum_{t} x_{kt} y_{jt}$, i.e. $B = \sum_t \vec{y}_t \vec{x}_t^{T}$
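The closed form above can be checked numerically. A minimal NumPy sketch (the dimensions, noise level, and random data here are made up for illustration); it builds $A = \sum_t \vec{x}_t\vec{x}_t^{T}$ and $B = \sum_t \vec{y}_t\vec{x}_t^{T}$ from stacked data matrices and compares $M = BA^{-1}$ against a library least-squares solve:

```python
import numpy as np

# Hypothetical dimensions: y_t has J entries, x_t has K entries, T time steps.
rng = np.random.default_rng(0)
J, K, T = 3, 4, 200

M_true = rng.normal(size=(J, K))
# Columns of X are the 0/1 vectors x_t, as in the question.
X = rng.integers(0, 2, size=(K, T)).astype(float)
Y = M_true @ X + 0.01 * rng.normal(size=(J, T))  # y_t = M x_t + e_t

# Normal equations from the derivation: A = sum_t x_t x_t^T, B = sum_t y_t x_t^T.
A = X @ X.T
B = Y @ X.T
M_hat = B @ np.linalg.inv(A)  # M = B A^{-1}, assuming A is invertible

# Cross-check: np.linalg.lstsq solves min ||X^T W - Y^T|| for W = M^T.
M_lstsq = np.linalg.lstsq(X.T, Y.T, rcond=None)[0].T

print(np.allclose(M_hat, M_lstsq))
```

With enough distinct $\vec{x}_t$ patterns, $A$ is full rank and both routes recover $M$ up to the noise level; if $A$ is singular (e.g. some component of $\vec{x}$ is always zero), `lstsq` still returns a minimum-norm solution while the explicit inverse fails.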