Problem with linear regression normal equations when a coefficient should be zero


I am having trouble using the linear regression normal equations to minimize a function of several variables. The least-squares solution is:

$$\beta = (X^TX)^{-1}X^TY$$ where $\beta$ is the coefficient vector of the function being fit: $$f(x,y)=\beta_1 g_1(k_1,x,y)+\dots+\beta_n g_n(k_n,x,y).$$ Each $g_j$ is a function of $k_j$, $x$ and $y$.

The $k$'s are parameters that I set manually.

$(x_i, y_i)$ are the coordinates of the $i$-th measurement point, for $i\in \{1,\dots,m\}$,

and the matrix $X$ is

$$X=\begin{bmatrix} g_1(k_1,x_1,y_1) & \dots & g_n(k_n,x_1,y_1) \\ \vdots & & \vdots \\ g_1(k_1,x_m,y_m) & \dots & g_n(k_n,x_m,y_m) \end{bmatrix}$$ and $Y=[Y_{\text{true}}(x_1,y_1),\dots,Y_{\text{true}}(x_m,y_m)]^T$, the vector of true experimental results at the $m$ measurement points.
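To make the setup concrete, here is a minimal sketch of the procedure in NumPy. The basis function `g` below is a placeholder of my own choosing (the real one is problem-specific), and the data are synthetic:

```python
import numpy as np

# Hypothetical basis function (an assumption -- the real g is problem-specific).
def g(k, x, y):
    return np.sin(k * x) * np.cos(k * y)

rng = np.random.default_rng(0)
m = 50                                  # number of measurement points
ks = [0.5, 1.0, 2.0]                    # manually chosen k parameters
x = rng.uniform(-1, 1, m)
y = rng.uniform(-1, 1, m)

# Design matrix: X[i, j] = g_j(k_j, x_i, y_i)
X = np.column_stack([g(k, x, y) for k in ks])

beta_true = np.array([2.0, -1.0, 0.5])
Y = X @ beta_true                       # synthetic "true" measurements

# Normal-equations solution: beta = (X^T X)^{-1} X^T Y
beta = np.linalg.solve(X.T @ X, X.T @ Y)
```

With distinct nonzero $k$'s the columns of $X$ are independent here, so $X^TX$ is invertible and `beta` recovers `beta_true`.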

This all works nicely, except when one of the $k$'s is zero. Ideally, if I set some $k_j=0$, the corresponding $\beta_j$ should also equal zero. Instead, setting a $k$ to zero ends up making all of the $\beta$ coefficients equal to zero. Why is this, and is there a workaround?
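My guess at what is happening: if $g_j(0,x,y)=0$ for all $(x,y)$ (true for the placeholder `g` below), then setting $k_j=0$ makes column $j$ of $X$ identically zero, so $X^TX$ is singular and $(X^TX)^{-1}$ does not exist; the solver's output is then meaningless. A sketch reproducing this, plus two possible workarounds (again using a hypothetical `g` of my own):

```python
import numpy as np

def g(k, x, y):
    # hypothetical basis with g(0, x, y) == 0 (an assumption)
    return np.sin(k * x) * np.cos(k * y)

rng = np.random.default_rng(0)
m = 50
x = rng.uniform(-1, 1, m)
y = rng.uniform(-1, 1, m)

ks = [0.0, 1.0, 2.0]                        # first k set to zero
X = np.column_stack([g(k, x, y) for k in ks])
Y = -1.0 * g(1.0, x, y) + 0.5 * g(2.0, x, y)

# Column 0 of X is all zeros, so X^T X has rank 2, not 3: it is singular
# and the normal-equations inverse does not exist.
rank = np.linalg.matrix_rank(X.T @ X)

# Workaround 1: solve the least-squares problem via the pseudoinverse;
# the minimum-norm solution assigns 0 to the zero column's coefficient.
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Workaround 2: drop the degenerate column before solving, then
# reinsert a literal 0 for the corresponding beta.
keep = np.any(X != 0, axis=0)
Xk = X[:, keep]
beta_kept = np.linalg.solve(Xk.T @ Xk, Xk.T @ Y)
```

Both workarounds give the behavior asked for: the $\beta$ attached to the $k=0$ basis function is zero while the others are still fit correctly.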