Minimising two interdependent equations with least squares regression.


Originally, I had a set of points in three-dimensional space that I was fitting using linear regression. So my model is

$$Y = \alpha A+ \beta B$$

where $Y = \{y_i\}$ is the dependent variable, and $A = \{a_i\}$ and $B = \{b_i\}$ are predictors, $i = 1, \dots, N$. However, now I would like to add another condition,
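For concreteness, here is a minimal sketch of the current fit as ordinary least squares, using Python with synthetic data (the data and the true coefficients are made up for illustration; the R equivalent would be `lm(Y ~ 0 + A + B)`, with no intercept):

```python
import numpy as np

# Synthetic data standing in for the real observations.
rng = np.random.default_rng(0)
N = 100
A = rng.normal(size=N)
B = rng.normal(size=N)
alpha_true, beta_true = 2.0, -0.5
Y = alpha_true * A + beta_true * B + 0.01 * rng.normal(size=N)

# Stack the predictors into an N x 2 design matrix (no intercept)
# and solve the least-squares problem Y ≈ alpha*A + beta*B.
X = np.column_stack([A, B])
(alpha, beta), *_ = np.linalg.lstsq(X, Y, rcond=None)
```

With small noise, `alpha` and `beta` recover values close to the true coefficients.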

$$C = \frac{1 - \beta}{ \alpha - 1}$$

where $C = \{c_i\}$ is a quantity derived from $A$ and $B$. Note that I would like to use the $\alpha, \beta$ obtained from the previous equation.

If I understand my problem correctly, I'm trying to find an $\alpha$ and a $\beta$ that jointly minimise these two quantities:

$$ \min_{\alpha, \beta} \left\| Y - \alpha A - \beta B \right\|^2 $$ $$ \min_{\alpha, \beta} \left\| C - \frac{1 - \beta}{\alpha - 1} \right\|^2 $$

Is it possible to still use linear regression for this kind of problem? If yes, how should I go about it? If not, what tools should I be considering?

If it helps, I've been working in R, using `lm` and `predict.lm`.
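One common way to pose this kind of problem is to stack both residual vectors into a single objective and hand it to a nonlinear least-squares solver, since the second term is nonlinear in $\alpha$ and $\beta$. A hedged sketch in Python with `scipy.optimize.least_squares` (the weight `w` trading off the two objectives, the synthetic data, and the starting point are all assumptions, not part of the original question; in R the analogous tools would be `nls` or `optim`):

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data consistent with both the linear model and the constraint.
rng = np.random.default_rng(1)
N = 100
A = rng.normal(size=N)
B = rng.normal(size=N)
alpha_true, beta_true = 2.0, -0.5
Y = alpha_true * A + beta_true * B + 0.01 * rng.normal(size=N)
C = (1 - beta_true) / (alpha_true - 1) + 0.01 * rng.normal(size=N)

w = 1.0  # assumed relative weight of the second objective

def residuals(params):
    alpha, beta = params
    r1 = Y - alpha * A - beta * B                       # linear-model residuals
    r2 = np.sqrt(w) * (C - (1 - beta) / (alpha - 1))    # constraint residuals
    return np.concatenate([r1, r2])

# Start away from alpha = 1, where the second term is undefined.
fit = least_squares(residuals, x0=[2.5, 0.0])
alpha_hat, beta_hat = fit.x
```

Note that minimising the stacked objective gives a single compromise solution; choosing `w` controls how strongly the constraint on $C$ pulls the fit away from the plain regression estimates.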