I could not find any information on this problem, or even a name for it. I have $K$ equations
$$ \mathbf{y} = \mathbf{X}_1 \boldsymbol{\beta}_1 + \mathbf{X}_0\boldsymbol{\gamma} $$ $$ \vdots $$ $$ \mathbf{y} = \mathbf{X}_K \boldsymbol{\beta}_K + \mathbf{X}_0\boldsymbol{\gamma} $$
As you can see, all of the equations share the same $\mathbf{y}$ and $\mathbf{X}_0\boldsymbol{\gamma}$. Also, any of $\mathbf{X}_k$ and $\mathbf{X}_0$ may have more than one column (but all have the same number of rows).
I'd like to find a solution to this overdetermined system, i.e. find parameters $\boldsymbol{\beta}_k$ and $\boldsymbol{\gamma}$ that minimize the total sum of squared residuals over all of these equations. If it were not for the common term, I would simply apply standard OLS to each equation separately.
Certainly, I could find a solution by brute-force optimization, but I wonder: is there an analytical solution?
I am not sure if the context is relevant here, but I want to write an expectation-maximization algorithm for a mixture of Gaussian regressions where some of the terms are known to be the same across the regressions.
Let $K=2$, so we are working with just $y$, $x_0$, $x_1$, and $x_2$.
Stacking the two equations, the left-hand side becomes
$$ \begin{bmatrix} y^1\\ \vdots\\ y^n\\ y^1\\ \vdots\\ y^n\\ \end{bmatrix} $$
and the right-hand side becomes $X \beta$, where $X$ is
$$ \begin{bmatrix} x_0^1 & x_1^1 & 0\\ \vdots & \vdots & \vdots\\ x_0^n & x_1^n & 0\\ x_0^1 & 0 & x_2^1\\ \vdots & \vdots & \vdots\\ x_0^n & 0 & x_2^n\\ \end{bmatrix} $$
and $\beta$ is $$ \begin{bmatrix} \gamma\\ \beta_1\\ \beta_2\\ \end{bmatrix} $$
Then solve as usual: premultiply both sides by $X^T$, then premultiply by $(X^TX)^{-1}$, giving $\hat{\beta} = (X^TX)^{-1}X^T y$.
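The stacking above is easy to check numerically. Here is a minimal NumPy sketch for $K=2$ with single-column regressors and synthetic data (all variable names and dimensions are made up for illustration); `lstsq` is used instead of an explicit inverse for numerical stability, but it solves the same normal equations:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x0 = rng.normal(size=(n, 1))  # shared regressor (may have >1 column)
x1 = rng.normal(size=(n, 1))  # specific to equation 1
x2 = rng.normal(size=(n, 1))  # specific to equation 2
y = rng.normal(size=n)        # the same y appears in both equations

zeros = np.zeros((n, 1))
# Block design matrix; columns correspond to [gamma, beta_1, beta_2]
X = np.block([[x0, x1, zeros],
              [x0, zeros, x2]])
y_stacked = np.concatenate([y, y])  # y repeated once per equation

# Ordinary least squares on the stacked system
theta, *_ = np.linalg.lstsq(X, y_stacked, rcond=None)
gamma_hat, beta1_hat, beta2_hat = theta
```

The stacked solution minimizes the sum of squared residuals of both equations jointly, which is exactly the objective described in the question.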