Find the constrained least-squares estimator for a multiple regression model


Consider the multiple regression model

$$Y=X\beta+\epsilon$$

with the restriction that $\sum_{i=1}^n \beta_i=1$.

I want to find the least squares estimator of $\beta$, so I need to solve the following optimization problem:

$$\min_{\beta}\ (Y-X\beta)^t(Y-X\beta)$$

$$\text{s.t.}\quad \sum_{i=1}^n \beta_i=1.$$

Let's set

$$L=(Y-X\beta)^t(Y-X\beta)-\lambda(U^t\beta-1)=Y^tY+\beta^tX^tX\beta-2\beta^tX^tY-\lambda(U^t\beta-1),$$ where $U$ is a vector of ones (and therefore $U^t\beta=\sum_{i=1}^n \beta_i$).

Take derivatives:

$\frac{\partial L}{\partial\beta}=2X^tX\beta-2X^tY-\lambda U^t=0$

$\frac{\partial L}{\partial\lambda}=U^t\beta-1=0$

So from the first equation I can get an expression for $\beta$ in terms of $\lambda$, but how do I get rid of the $\lambda$? The second equation on its own doesn't seem to help.

There is 1 answer below.

First, I think the $U^t$ in the first derivative should be $U$: the gradient of $\lambda U^t\beta$ with respect to $\beta$ is $\lambda U$, a column vector.

The two first-order conditions can then be written as a linear system of equations:

\begin{equation} \left[ \begin{matrix} 2X^tX& -U\\ U^t & 0 \end{matrix} \right] \left[ \begin{matrix} \beta\\ \lambda \end{matrix} \right]=\left[ \begin{matrix} 2X^tY\\ 1 \end{matrix} \right]. \end{equation}

Solving this system yields both $\beta$ and $\lambda$ at once. Equivalently, you can eliminate $\lambda$ by hand: the first equation gives $\beta=(X^tX)^{-1}\left(X^tY+\tfrac{\lambda}{2}U\right)$, and substituting this into $U^t\beta=1$ yields
$$\lambda=\frac{2\left(1-U^t(X^tX)^{-1}X^tY\right)}{U^t(X^tX)^{-1}U}.$$
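As a sanity check, the block system above can be solved numerically. Here is a minimal sketch with NumPy; the dimensions ($n=5$ observations, $p=3$ coefficients) and the random data are hypothetical, purely for illustration:

```python
import numpy as np

# Hypothetical small example: n = 5 observations, p = 3 coefficients.
rng = np.random.default_rng(0)
n, p = 5, 3
X = rng.standard_normal((n, p))
Y = rng.standard_normal(n)
U = np.ones(p)  # vector of ones, so U @ beta = sum of the beta_i

# Assemble the block system  [2 X'X  -U; U'  0] [beta; lambda] = [2 X'Y; 1].
A = np.zeros((p + 1, p + 1))
A[:p, :p] = 2 * X.T @ X
A[:p, p] = -U
A[p, :p] = U
b = np.concatenate([2 * X.T @ Y, [1.0]])

sol = np.linalg.solve(A, b)
beta, lam = sol[:p], sol[p]

print("beta =", beta)
print("sum of beta =", beta.sum())  # should be 1 up to rounding
```

The printed coefficients sum to 1, and the recovered $\lambda$ matches the closed-form expression obtained by eliminating $\lambda$ by hand.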