Comparison of two least-squares optimization problems


I have come across two least-squares-style minimization problems. The first one is:

$$\min_{\beta\in \mathbb{R}} \sum_{j=1}^{n} \lvert y_j-x_j\beta\rvert.$$

Here $y$ is the dependent random variable and $x$ is the independent random variable, observed as pairs $(x_j, y_j)$ for $j = 1, \dots, n$.

What does $\beta$ mean when $x_{j} = 1$ for all $j$?

The second minimization problem is:

$$\min_{\beta\in \mathbb{R}} \sum_{j=1}^{n} \lvert y_j - x_j\beta \rvert ^2.$$

What is the difference between these two minimization problems, and which one is the better option?
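For intuition, here is a minimal numeric sketch (assuming the objectives sum over $j = 1, \dots, n$, and taking the special case $x_j = 1$ for all $j$, with hypothetical data): the absolute-value objective is minimized by the median of the $y_j$, while the squared objective is minimized by their mean, which is why the squared loss is more sensitive to outliers.

```python
# Hypothetical data with one outlier (assumed for illustration).
y = [1.0, 2.0, 3.0, 4.0, 20.0]

def abs_loss(b):
    # First objective with x_j = 1: sum_j |y_j - b|
    return sum(abs(yj - b) for yj in y)

def sq_loss(b):
    # Second objective with x_j = 1: sum_j (y_j - b)^2
    return sum((yj - b) ** 2 for yj in y)

# Brute-force search over a grid of candidate betas from 0.00 to 21.00.
grid = [i / 100 for i in range(2101)]
beta_abs = min(grid, key=abs_loss)  # lands on the median of y (3.0)
beta_sq = min(grid, key=sq_loss)    # lands on the mean of y (6.0)

print(beta_abs, beta_sq)
```

The outlier at 20 drags the squared-loss minimizer up to 6.0, while the absolute-loss minimizer stays at 3.0, illustrating the robustness difference between the two criteria.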