How is the cost function $J(\theta)$ always non-negative for logistic regression?


I am studying Logistic Regression from Andrew Ng's Machine Learning Course. A quiz in the course stated that

The cost function J(θ) for logistic regression trained with m≥1 examples is always greater than or equal to zero.

which is apparently correct. But I am unable to understand how this holds, given that there is clearly a negative sign in the cost function as described:

$$J(\theta) = \dfrac{1}{m} \sum_{i=1}^m \mathrm{Cost}\big(h_\theta(x^{(i)}), y^{(i)}\big)$$

$$\mathrm{Cost}\big(h_\theta(x), y\big) = \begin{cases} -\log\big(h_\theta(x)\big) & \text{if } y = 1 \\ -\log\big(1 - h_\theta(x)\big) & \text{if } y = 0 \end{cases}$$
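For concreteness, here is a minimal sketch of this cost function in Python, assuming the usual sigmoid hypothesis $h_\theta(x) = \sigma(\theta^T x)$ from the course (the function names `sigmoid`, `cost`, and `J` are my own, not from the lectures):

```python
import math

def sigmoid(z):
    # logistic hypothesis: sigma(z) = 1 / (1 + e^{-z}), strictly inside (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def cost(h, y):
    # per-example cost from the formulas above
    return -math.log(h) if y == 1 else -math.log(1.0 - h)

def J(zs, ys):
    # average cost over m examples; zs holds the values theta^T x^(i)
    m = len(ys)
    return sum(cost(sigmoid(z), y) for z, y in zip(zs, ys)) / m

print(J([2.0, -1.0, 0.5], [1, 0, 1]))  # a non-negative number
```

Despite the minus signs in `cost`, every value this prints is non-negative, which is exactly the quiz's claim.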

Best answer:

Since $h_\theta(x) = \sigma(\theta^T x)$ is a sigmoid, its output always lies in the open interval $(0,1)$. For any $h \in (0,1)$ you get $\ln(h) < 0$, and thus $-\ln(h) > 0$. The same argument applies to the $y = 0$ branch, since $1 - h_\theta(x) \in (0,1)$ as well. Every term in the sum is therefore positive, and $J(\theta)$, being an average of such terms, is non-negative.
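The key fact, that $-\log(h) > 0$ on $(0,1)$, is easy to spot-check numerically; a small sketch:

```python
import math

# for any h strictly between 0 and 1, log(h) < 0, hence -log(h) > 0
for h in (1e-6, 0.25, 0.5, 0.75, 1 - 1e-6):
    assert math.log(h) < 0
    assert -math.log(h) > 0
    # the same holds for the y = 0 branch, since 1 - h is also in (0, 1)
    assert -math.log(1.0 - h) > 0

print("all checks passed")
```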