Solve Linear Regression with Linear Inequality Constraints for the $ y = -a x + b $ Model


I need a formula to calculate $ a $ and $ b $ for the model

$$ y = -a x + b $$

under the constraint $ a < 0 $. What is the easiest way to fit such a line when $ a < 0 $ must hold?

And my Dataset is something like that:

9 1
8 2
7 3
6 4
5 5
4 6
3 7
2 8
1 9

1 Answer


Your problem is basically:

$$\begin{align*} \arg \min_{ \theta } \quad & \frac{1}{2} \left\| M \theta - y \right\|_{2}^{2} \\ \text{subject to} \quad & {z}^{T} \theta \leq 0 \end{align*}$$

Where $ \theta \in \mathbb{R}^{2} $, namely $ {\theta}_{1} = a $, $ {\theta}_{2} = b $, and $ z = {\left[ 1, 0 \right]}^{T} $.

This is a simple Least Squares problem with Linear Inequality Constraints.
You can solve it with many methods (even Projected Gradient Descent will do, or any MATLAB built-in solver).
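A minimal Projected Gradient Descent sketch in Python/NumPy, under the assumption that the design matrix is $ M = \left[ -x, \boldsymbol{1} \right] $ so that $ M \theta = -a x + b $ matches the model (the answer leaves $ M $ implicit):

```python
import numpy as np

# Data from the question: first column x, second column y (here y = 10 - x).
x = np.array([9., 8., 7., 6., 5., 4., 3., 2., 1.])
y = np.array([1., 2., 3., 4., 5., 6., 7., 8., 9.])

# Design matrix for the model y = -a*x + b with theta = (a, b):
# M @ theta = -a*x + b, hence the first column is -x.
M = np.column_stack([-x, np.ones_like(x)])

theta = np.zeros(2)
step = 1.0 / np.linalg.norm(M, 2) ** 2    # 1/L, L = largest eigenvalue of M^T M

for _ in range(5000):
    grad = M.T @ (M @ theta - y)          # gradient of 0.5 * ||M theta - y||^2
    theta -= step * grad                  # gradient step
    theta[0] = min(theta[0], 0.0)         # project onto the feasible set {a <= 0}

a, b = theta
```

For this particular dataset the unconstrained slope coefficient is $ a = 1 > 0 $, so the projection pins $ a $ at the boundary $ 0 $ and $ b $ converges to $ \bar{y} = 5 $, i.e. the constant (DC) fit.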

Remark
Note that in order to keep this a convex optimization problem (with a closed feasible set), I set the constraint as a weak inequality.
The model will choose $ {\theta}_{1} = 0 $ only when a constant value function (DC) performs better than a linear function.
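This boundary case can be cross-checked with SciPy's bounded least-squares solver, since the constraint $ a \leq 0 $ is a simple box bound (assuming SciPy is available; the column ordering $ \theta = \left( a, b \right) $ and the first design column $ -x $ are my assumptions, chosen to match the model $ y = -a x + b $):

```python
import numpy as np
from scipy.optimize import lsq_linear

x = np.arange(9.0, 0.0, -1.0)               # 9, 8, ..., 1
y = 10.0 - x                                # matches the question's dataset
M = np.column_stack([-x, np.ones_like(x)])  # model y = -a*x + b, theta = (a, b)

# Box constraint a <= 0 (b is free); lsq_linear handles bound constraints.
res = lsq_linear(M, y, bounds=([-np.inf, -np.inf], [0.0, np.inf]))
a, b = res.x
print(a, b)   # the constraint is active: a = 0, b = mean(y) = 5
```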