Least squares constrained by fixing the average of the predictions


I am unsure whether this question is better suited to math.SE or CrossValidated, but since I am a mathematician I will post it here.

Are there any well-known methods for performing least-squares estimation with a constraint on the mean of the predictions?

Namely, suppose we have observations $x_i\in\mathbb{R}^n$ and responses $y_i\in \mathbb{R}$. I would like to find a linear model $$\hat y_i = \beta_0+\beta^Tx_i$$ minimizing the residual sum of squares $$RSS = \sum_{i=1}^N(\hat y_i - y_i)^2$$ under the condition that $$\frac{1}{N}\sum_{i=1}^N\hat y_i = c$$ for some given $c\in\mathbb{R}$ (i.e. we want the average of the predictions to be $c$).
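For concreteness, here is a small numerical sketch of one way to handle the constraint (not from the question itself; it assumes NumPy and uses synthetic data). Writing $\bar x = \frac1N\sum_i x_i$, the constraint reads $\beta_0 + \beta^T\bar x = c$; substituting $\beta_0 = c - \beta^T\bar x$ into the model gives $\hat y_i = c + \beta^T(x_i - \bar x)$, so the constrained problem reduces to an unconstrained regression of $y - c$ on the centered covariates:

```python
import numpy as np

# Synthetic data (illustrative only)
rng = np.random.default_rng(0)
N, n = 200, 3
X = rng.normal(size=(N, n))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.7 + rng.normal(size=N)
c = 3.0  # required mean of the predictions

# Substitute beta0 = c - beta^T xbar into the model:
# yhat_i = c + beta^T (x_i - xbar), so regress (y - c) on centered X.
xbar = X.mean(axis=0)
beta, *_ = np.linalg.lstsq(X - xbar, y - c, rcond=None)
beta0 = c - beta @ xbar

yhat = beta0 + X @ beta
print(yhat.mean())  # equals c up to floating-point error
```

Note that for any $\beta$ the substituted model already satisfies the constraint exactly, since the centered covariates average to zero; the optimization is only over the remaining $\beta$.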

I have tried working out the 1-dimensional case $n=1$. Substituting the constraint $\beta_0 + \beta\bar x = c$ (where $\bar x = \frac1N\sum_i x_i$) gives $\hat y_i = c + \beta(x_i - \bar x)$, leaving a single free parameter $\beta$; minimizing the RSS over $\beta$ then yields the ordinary least-squares slope, with the intercept shifted to $\beta_0 = c - \beta\bar x$ so that the predictions average to $c$ instead of $\bar y$. (In particular the solution is not in general the constant model $\hat y = c$, which satisfies the constraint but does not minimize the RSS unless the OLS slope is zero.)
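A quick numerical sanity check of the one-dimensional case (again assuming NumPy, with synthetic data): substituting the constraint and minimizing over the remaining slope reproduces the unconstrained OLS slope, and shifts the intercept by $c - \bar y$.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100
x = rng.normal(size=N)
y = 2.0 * x + 1.0 + rng.normal(size=N)
c = 5.0  # required mean of the predictions

# Unconstrained OLS slope and intercept
slope = np.cov(x, y, bias=True)[0, 1] / np.var(x)
intercept = y.mean() - slope * x.mean()

# Constrained fit: substitute beta0 = c - beta * xbar, minimize over beta
xc = x - x.mean()
beta = xc @ (y - c) / (xc @ xc)
beta0 = c - beta * x.mean()

print(np.isclose(beta, slope))                         # True: slope unchanged
print(np.isclose(beta0, intercept + (c - y.mean())))   # True: intercept shifted by c - ybar
```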

Higher-dimensional cases seem pretty messy to me. I have tried working in general, but with scarce results. However, I am not an expert in this kind of computation, so I was wondering whether someone can do better, or whether this kind of model has been widely studied (I would actually expect so, as it is a pretty natural constraint to consider in some cases).
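One standard route for the general case, sketched below under the same synthetic-data assumptions, is to treat this as equality-constrained least squares and solve the KKT (Lagrange-multiplier) linear system directly: with design matrix $A = [\mathbf{1}\,|\,X]$ and constraint vector $a = (1, \bar x)$, the stationarity condition $2A^TA\,b + \lambda a = 2A^Ty$ together with $a^Tb = c$ is one linear solve.

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 150, 4
X = rng.normal(size=(N, n))
y = rng.normal(size=N)
c = -1.0  # required mean of the predictions

A = np.hstack([np.ones((N, 1)), X])            # design with intercept column
a = np.concatenate([[1.0], X.mean(axis=0)])    # (1/N) * column sums of A

# KKT system:  [2 A^T A   a] [b  ]   [2 A^T y]
#              [  a^T     0] [lam] = [   c   ]
K = np.zeros((n + 2, n + 2))
K[: n + 1, : n + 1] = 2 * A.T @ A
K[: n + 1, -1] = a
K[-1, : n + 1] = a
rhs = np.concatenate([2 * A.T @ y, [c]])

sol = np.linalg.solve(K, rhs)
b = sol[:-1]                                   # (beta0, beta)
print((A @ b).mean())                          # equals c up to floating-point error
```

Since $\frac1N\mathbf{1}^TA = a^T$, the last KKT equation is exactly the mean-of-predictions constraint; when $A$ has full column rank this linear system has a unique solution, and it agrees with the substitution approach.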