Is there a link between entropy and the loss function of logistic regression?


Entropy is a logarithmic measure of the number of states with significant probability of being occupied:

$S(X) = -\sum_i p_i \log p_i.$

Consider the case where $X \in \{0,1\}$ with

$P(X=1)=p, \quad P(X=0)=1-p, \qquad \text{where } 0 \leq p \leq 1.$

Then the entropy takes the form (equation_1):

$S = -p\log(p) - (1-p)\log(1-p).$

The loss function of logistic regression has the form (equation_2):

$-y\log(\hat y) - (1-y)\log(1-\hat y).$

Equation_1 looks just like equation_2; is there a link between them?
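A minimal numeric check of the resemblance (a sketch in Python; the helper names `binary_entropy` and `logistic_loss` are mine, not standard API): if the label $y$ is drawn from a Bernoulli($p$) distribution and the model predicts $\hat y = p$, the expected logistic loss is exactly the entropy of equation_1.

```python
import math

def binary_entropy(p):
    """Entropy of a Bernoulli(p) variable (equation_1), in nats."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def logistic_loss(y, y_hat):
    """Logistic-regression loss for one example (equation_2), in nats."""
    return -y * math.log(y_hat) - (1 - y) * math.log(1 - y_hat)

# Averaging the loss over y ~ Bernoulli(p), with the prediction y_hat = p,
# recovers the entropy of the label distribution:
p = 0.3
expected_loss = p * logistic_loss(1, p) + (1 - p) * logistic_loss(0, p)
print(abs(expected_loss - binary_entropy(p)) < 1e-12)  # True
```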


BEST ANSWER

The loss function in logistic regression can be viewed as the cross entropy between two discrete distributions: a point mass at the observed label $y$ and the model's predicted Bernoulli($\hat y$) distribution.
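This can be made concrete with a short sketch (Python; the helper name `cross_entropy` is mine). The cross entropy $H(P, Q) = -\sum_i p_i \log q_i$ between two Bernoulli distributions reduces to equation_2 when $P$ is a point mass at the label, and reduces to the entropy of equation_1 when $Q = P$.

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(P, Q) between P = Bernoulli(p) and Q = Bernoulli(q),
    in nats: -p*log(q) - (1-p)*log(1-q)."""
    return -p * math.log(q) - (1 - p) * math.log(1 - q)

# For a hard label y in {0, 1}, P is a point mass and H(P, Q) is exactly
# the logistic-regression loss -y*log(y_hat) - (1-y)*log(1-y_hat):
y, y_hat = 1, 0.8
print(abs(cross_entropy(y, y_hat) - (-math.log(0.8))) < 1e-12)  # True
```

Note that `cross_entropy(p, p)` gives back the entropy of equation_1, which is why the two formulas share the same shape.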