Usually linear regression involves two variables $(x,y)$, i.e. an independent variable $x$ and a dependent variable $y$, related by the expression \begin{equation} y = a_0 + a_1 x, \end{equation} where $a_0$ and $a_1$ are the parameters that define the linear model. In linear regression we have one equation of this form for each pair of observed variables $(x_i,y_i)$, so we obtain an (overdetermined) linear system whose least-squares solution gives us $a_0$ and $a_1$.
Let's consider that we have two sets of independent-dependent variables, namely $(x,y)$ and $(w,z)$. The first pair $(x,y)$ is related by the previous equation, while the second pair $(w,z)$ is related by \begin{equation} z = b_0 + b_1 w, \end{equation} where $b_0$ and $b_1$ are the parameters that define the linear relation between $z$ and $w$. Again, a set of observations $(w_j,z_j)$ leads to a linear system whose solution gives us $b_0$ and $b_1$.
In general, if $a_0$, $a_1$, $b_0$ and $b_1$ are all independent, then we can solve the two linear systems separately. But now suppose that $a_0$ and $b_0$ are independent, while $a_1=b_1$. In this case the two linear systems must be solved simultaneously.
I've solved this problem by defining one linear system of equations involving both sets of equations, but I would like to know whether this problem has a specific name and how to approach it correctly. In particular, I want to know how to assess the fit quality (for example, with an equivalent of $R^2$).
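For concreteness, here is a sketch of the stacked-system approach described above, using made-up data (the values and variable names are illustrative, not part of the question): the two data sets are combined into a single design matrix with separate intercept columns and one shared-slope column, solved by least squares.

```python
import numpy as np

# Hypothetical observed data (for illustration only); both sets
# are exactly linear with the same slope 1.5.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 + 1.5 * x           # true model: a0 = 2, shared slope 1.5
w = np.array([0.0, 1.0, 2.0])
z = -1.0 + 1.5 * w          # true model: b0 = -1, same slope

# One stacked design matrix with columns [a0, b0, shared slope].
A = np.zeros((x.size + w.size, 3))
A[:x.size, 0] = 1.0         # a0 intercept column (rows of the (x, y) set)
A[x.size:, 1] = 1.0         # b0 intercept column (rows of the (w, z) set)
A[:x.size, 2] = x           # shared-slope column
A[x.size:, 2] = w
rhs = np.concatenate([y, z])

# Least-squares solution of the combined overdetermined system.
params, *_ = np.linalg.lstsq(A, rhs, rcond=None)
a0, b0, c1 = params
print(a0, b0, c1)  # recovers 2.0, -1.0, 1.5 since the data are exact
```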
Thank you!
If your model is $$y=a_0+c_1x,\\z=b_0+c_1w$$ you can minimize
$$\sum(a_0+c_1x-y)^2+\sum(b_0+c_1w-z)^2,$$
giving the equations
$$\sum (a_0+c_1x-y)=0,\\\sum (b_0+c_1w-z)=0,\\\sum x(a_0+c_1x-y)+\sum w(b_0+c_1w-z)=0.$$
Now solve this $3\times3$ system for $a_0,b_0,c_1$.
The fit quality is still given by the ratio of the explained variance to the total variance, i.e. an $R^2$ computed with the residuals of both data sets pooled together.
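A minimal numeric sketch of this recipe, with hypothetical noisy data: build the $3\times3$ normal equations above, solve for $a_0$, $b_0$, $c_1$, and compute a pooled $R^2$. Centering each data set about its own mean when forming the total variance is one reasonable convention, not something fixed by the answer.

```python
import numpy as np

# Hypothetical noisy data sets sharing a common slope c1 = 1.5.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 20)
w = np.linspace(0.0, 4.0, 15)
y = 2.0 + 1.5 * x + 0.1 * rng.standard_normal(x.size)
z = -1.0 + 1.5 * w + 0.1 * rng.standard_normal(w.size)

# The 3x3 normal equations in (a0, b0, c1), written in matrix form.
n, m = x.size, w.size
M = np.array([
    [n,       0.0,     x.sum()],
    [0.0,     m,       w.sum()],
    [x.sum(), w.sum(), (x**2).sum() + (w**2).sum()],
])
rhs = np.array([y.sum(), z.sum(), (x * y).sum() + (w * z).sum()])
a0, b0, c1 = np.linalg.solve(M, rhs)

# Pooled R^2: one minus residual variance over total variance,
# with each data set centered about its own mean (a chosen convention).
ss_res = ((y - (a0 + c1 * x))**2).sum() + ((z - (b0 + c1 * w))**2).sum()
ss_tot = ((y - y.mean())**2).sum() + ((z - z.mean())**2).sum()
r2 = 1.0 - ss_res / ss_tot
print(a0, b0, c1, r2)
```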