How to perform a two-way linear regression?


Given the dataset $\{(X_1, y_1), (X_2, y_2), ..., (X_n, y_n)\}$, where the $X_i$ are matrices of identical size and the $y_i$ are scalars, consider the following two-way linear regression scheme: $$ \hat{y_i} = u^{T} X_i v, $$ where $u$ and $v$ are vectors of appropriate sizes. Errors are measured in the least-squares sense, so the objective is $$ \min_{u,v} \sum_{i} \left(y_i - \hat{y_i}\right)^2. $$

  1. What is the general, closed-form linear-algebra solution for such a regression, if one exists?
  2. Any suggestions on solving this numerically, with the extra constraints that $\|u\| = 1$ and all entries of $v$ are nonnegative, given that all $y_i$ are nonnegative?
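For concreteness, here is a minimal NumPy sketch (my own illustration; the shapes and variable names are assumptions) of evaluating the bilinear prediction $\hat{y_i} = u^T X_i v$ over a whole stack of matrices:

```python
import numpy as np

# Each X_i is p x q, u has length p, v has length q,
# and each prediction u^T X_i v is a scalar.
rng = np.random.default_rng(0)
n, p, q = 5, 4, 3
X = rng.standard_normal((n, p, q))   # stack of n matrices
u = rng.standard_normal(p)
v = rng.standard_normal(q)

# All n predictions at once: contract over both matrix axes.
y_hat = np.einsum('p,ipq,q->i', u, X, v)

# Same thing, one sample at a time.
assert np.allclose(y_hat, [u @ X[i] @ v for i in range(n)])
```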
BEST ANSWER

An interesting problem. What is the motivation for such a regression? In any case, even in point 1 the problem is not linear in $(u, v)$ jointly. The minimized function is \begin{align} f:=\frac{1}{2}\sum_{i=1}^n\left(y_i-\sum_{k,l}u_kX_{i,kl}v_l\right)^2 \end{align} and the necessary first-order conditions have the form \begin{align} \frac{\partial f}{\partial u_K}=-\sum_{i=1}^n\left[\left(y_i-\sum_{k,l}u_kX_{i,kl}v_l\right)\sum_lX_{i,Kl}v_l\right]=0,\\ \frac{\partial f}{\partial v_L}=-\sum_{i=1}^n\left[\left(y_i-\sum_{k,l}u_kX_{i,kl}v_l\right)\sum_kX_{i,kL}u_k\right]=0, \end{align} i.e. you get a system of quadratic equations, which requires a nonlinear solver. For point 2, to enforce the constraints $\|u\| = 1$ and $v \ge 0$, you additionally need a constrained nonlinear optimizer that can handle the norm equality and the nonnegativity inequalities.
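Since the model is linear in $u$ for fixed $v$ and linear in $v$ for fixed $u$, one practical heuristic for point 2 is alternating least squares: solve for $v$ with a nonnegative least-squares step, solve for $u$ with ordinary least squares, and exploit the scale invariance $(u, v) \mapsto (u/c, cv)$ to enforce $\|u\| = 1$ at no cost to the objective. A sketch (my own construction, not from the answer above; it is a local heuristic for this non-convex problem, with no global-optimality guarantee):

```python
import numpy as np
from scipy.optimize import nnls

def bilinear_regression(X, y, n_iter=200, seed=0):
    """Alternating least squares for y_i ~ u^T X_i v
    with ||u|| = 1 and v >= 0.  X has shape (n, p, q)."""
    n, p, q = X.shape
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(p)
    u /= np.linalg.norm(u)
    v = np.ones(q)
    for _ in range(n_iter):
        # Fix u: y_i ~ (X_i^T u)^T v is a nonnegative LS problem in v.
        A = np.einsum('ipq,p->iq', X, u)      # rows a_i = X_i^T u
        v, _ = nnls(A, y)
        if not np.any(v):                     # guard against a degenerate all-zero v
            v = np.ones(q)
        # Fix v: y_i ~ (X_i v)^T u is an ordinary LS problem in u.
        B = np.einsum('ipq,q->ip', X, v)      # rows b_i = X_i v
        u, *_ = np.linalg.lstsq(B, y, rcond=None)
        # (u/c, c*v) leaves u^T X_i v unchanged, so we can
        # enforce ||u|| = 1 without changing the objective.
        c = np.linalg.norm(u)
        if c > 0:
            u /= c
            v *= c
    return u, v
```

Each alternating step can only decrease the residual, so the iteration is monotone; restarting from several random seeds and keeping the best result is a cheap hedge against bad local minima.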