Regarding Loss function of binary logistic regression using the sigmoid function


I have the following (average) log-likelihood function:

$L(w)=\frac{1}{n}\sum_{t}\log p(y_{t}\mid x_{t};w)$

and the following probability mass function:

$p(y_{t} = 1\mid x_{t};w) = \sigma(w^{T}x_{t})$

$p(y_{t} = 0\mid x_{t};w) = 1 - \sigma(w^{T}x_{t})$

so $y_{t}$ is binary and $p(y_{t}\mid x_{t};w)$ is a Bernoulli distribution.

From what I have seen in texts:

$L(w)=\frac{1}{n}\sum_{t}\log p(y_{t}\mid x_{t};w) =$

$=\frac{1}{n}\sum_{t}\left[y_{t}\log\sigma(w^{T}x_{t}) + (1-y_{t})\log(1 - \sigma(w^{T}x_{t}))\right]$

How did the $y_{t}$ and $(1 - y_{t})$ terms come out of the $\log$?

Best answer:

Note that since $y_t$ is binary, we can write the density function compactly as

$$p(y_t|x_t; w) = \sigma (w^Tx_t)^{y_t}(1-\sigma (w^Tx_t))^{1-y_t}$$

Hence, taking the logarithm brings the exponents $y_t$ and $1-y_t$ down as factors:

\begin{align}\log p(y_t\mid x_t; w) &= \log\left(\sigma(w^Tx_t)^{y_t}(1-\sigma(w^Tx_t))^{1-y_t}\right) \\ &= \log\left(\sigma(w^Tx_t)^{y_t}\right) + \log\left((1-\sigma(w^Tx_t))^{1-y_t}\right) \\ &= y_t\log\sigma(w^Tx_t) + (1-y_t)\log(1-\sigma(w^Tx_t))\end{align}
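As a sanity check, the identity can be verified numerically: for $y_t \in \{0, 1\}$ the compact Bernoulli form $\sigma(w^Tx_t)^{y_t}(1-\sigma(w^Tx_t))^{1-y_t}$ agrees with the piecewise definition, and its log agrees with the expanded $y_t\log\sigma + (1-y_t)\log(1-\sigma)$ form. A minimal sketch (the score value $w^Tx_t = 0.7$ below is an arbitrary example):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def p_cases(y, s):
    # Piecewise definition: sigma(w^T x) if y = 1, else 1 - sigma(w^T x)
    return s if y == 1 else 1.0 - s

def p_compact(y, s):
    # Compact Bernoulli form: s^y * (1 - s)^(1 - y)
    return s ** y * (1.0 - s) ** (1 - y)

def log_p_expanded(y, s):
    # Exponents pulled out of the log, as in the derivation above
    return y * math.log(s) + (1 - y) * math.log(1.0 - s)

s = sigmoid(0.7)  # arbitrary example score w^T x
for y in (0, 1):
    assert math.isclose(p_cases(y, s), p_compact(y, s))
    assert math.isclose(math.log(p_compact(y, s)), log_p_expanded(y, s))
```

Note that the compact form works only because $y_t$ takes exactly the values 0 and 1: each exponent switches its factor on or off.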