Logistic Regression: Are thresholds and non-negative constraints equivalent?


For interpretability, I want to restrict the weights $\beta_i$ in a logistic regression

$$p(x) = \frac{1}{1 + e^{- \beta^Tx}} $$

to be non-negative, i.e. $\beta_i \geq 0$.

Since the negative log-likelihood of the model is known to be convex, the constrained problem can be solved using convex optimisation.

At first sight, this constrained optimisation should give different results from fitting a standard logistic regression and then setting all the negative weights to 0. At least my implementation in Python yields different results for the two approaches.
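To make the comparison concrete, here is a minimal sketch (on synthetic data, with illustrative names; not my original implementation) that fits the unconstrained model, clips its negative weights to zero, and separately fits the model under the constraint $\beta_i \geq 0$ via box-constrained minimisation of the negative log-likelihood:

```python
# Sketch: compare (a) box-constrained fit with beta_i >= 0 against
# (b) unconstrained fit followed by clipping negatives to zero.
# Synthetic data; variable names are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic sigmoid

rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
beta_true = np.array([1.5, -2.0, 0.5])   # one genuinely negative weight
y = rng.binomial(1, expit(X @ beta_true))

def neg_log_lik(beta):
    # Numerically stable -log-likelihood:
    # -sum(y*log p + (1-y)*log(1-p)) = sum(log(1+e^z) - y*z), z = X beta
    z = X @ beta
    return np.sum(np.logaddexp(0.0, z) - y * z)

x0 = np.zeros(d)

# (b) unconstrained MLE, then clip negative weights to 0
unconstrained = minimize(neg_log_lik, x0, method="L-BFGS-B").x
clipped = np.maximum(unconstrained, 0.0)

# (a) constrained MLE: beta_i >= 0 as box bounds
constrained = minimize(neg_log_lik, x0, method="L-BFGS-B",
                       bounds=[(0.0, None)] * d).x

print("unconstrained  :", unconstrained)
print("clipped        :", clipped)
print("constrained    :", constrained)
print("NLL clipped    :", neg_log_lik(clipped))
print("NLL constrained:", neg_log_lik(constrained))
```

Since the clipped vector is feasible for the constrained problem, the constrained fit can never have a worse (higher) negative log-likelihood, and in general it attains a strictly better one by re-adjusting the remaining weights.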

But since the function I want to minimise is convex, shouldn't the optimal solution of the constrained problem be the point where all weights that would normally be negative are simply set to 0? After all, that seems to be the closest feasible point to the unique unconstrained minimum, assuming it exists.

Where am I going wrong? Are the two approaches equivalent or not?