confused about beta weights in logistic regression


I've recently started getting into machine learning, and I saw that building a simple classifier with logistic regression is a good place to start. I was following along with this article and got to the section just before the conclusion, where I got stuck.

In the article, the author uses the equations m = -(b1 / b2) and b = -(b0 / b2). How did they get those?

I know that the equation for the boundary line is σ(β1x1 + ... + βnxn + b), but when I try plugging that in with my weights it just forms (what looks like) a straight line.

Why does 1/(1 + e^(mx+b)) work, but 1/(1 + e^(β1x1 + ... + βnxn + b)) not?

Example plot of 1/(1 + e^(mx+b))

Example plot of 1/(1 + e^(β1x1 + ... + βnxn + b))

I'm sorry if I seem unclear. If you want further explanation, I'm happy to give it.

Best answer:

$$y = \beta_2x_2+\beta_1x_1+\beta_0$$

The decision boundary is where $y = 0$:

$$0 = \beta_2x_2+\beta_1x_1+\beta_0$$

$$-\beta_2x_2 = \beta_1x_1+\beta_0$$

$$x_2 = \left( -\frac{\beta_1}{\beta_2}\right) x_1 - \left( \frac{\beta_0}{\beta_2}\right)$$

Hence the slope is $m = -\frac{\beta_1}{\beta_2}$ and the intercept is $b = -\frac{\beta_0}{\beta_2}$.
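As a quick sanity check (using made-up coefficients, not the article's), every point on the line $x_2 = m x_1 + b$ makes the linear score zero, so the sigmoid evaluates to exactly 0.5 there. A minimal sketch:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Made-up example weights: z = b0 + b1*x1 + b2*x2
b0, b1, b2 = -3.0, 2.0, 4.0

# Boundary line in the (x1, x2) plane, from the derivation above
m = -b1 / b2   # slope
b = -b0 / b2   # intercept

for x1 in [-1.0, 0.0, 2.5]:
    x2 = m * x1 + b               # a point on the boundary line
    z = b0 + b1 * x1 + b2 * x2    # linear score at that point (always 0)
    print(x1, sigmoid(z))         # sigmoid is 0.5 everywhere on the boundary
```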

As for your plot, there could be several reasons it looks wrong:

  • Make sure you actually optimize the parameters before plotting.
  • For each independent variable, make sure the corresponding coefficient is large enough to matter.
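To illustrate the first point, here is a minimal gradient-descent sketch on made-up toy data (two Gaussian blobs; none of this comes from the article), which fits the weights and then extracts the boundary slope and intercept using the formulas above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D data: two Gaussian blobs, purely illustrative
X0 = rng.normal([-2.0, -2.0], 1.0, size=(100, 2))
X1 = rng.normal([2.0, 2.0], 1.0, size=(100, 2))
X = np.vstack([X0, X1])
y = np.r_[np.zeros(100), np.ones(100)]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Gradient descent on the logistic (cross-entropy) loss
w = np.zeros(2)   # [beta1, beta2]
bias = 0.0        # beta0
lr = 0.1
for _ in range(2000):
    p = sigmoid(X @ w + bias)
    w -= lr * (X.T @ (p - y)) / len(y)
    bias -= lr * np.mean(p - y)

# Boundary line x2 = m*x1 + c, from the derivation in the answer
m = -w[0] / w[1]
c = -bias / w[1]
print("slope:", m, "intercept:", c)
```

With untrained (all-zero) weights the boundary formula is meaningless, which is one way to end up with a nonsensical plot.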