Entropy is a logarithmic measure of the number of states with a significant probability of being occupied:
${\displaystyle S (X)=-\sum _{i}p_{i}\log p_{i},}$
Consider the case where $X \in \{0,1\}$ and
$P(X=1)=p, \quad P(X=0)=1-p, \quad \text{where} \quad 0\leq p \leq 1.$
Then the entropy takes the form (equation 1):
$S = -p\log(p) - (1-p)\log(1-p).$
The loss function of logistic regression takes the form (equation 2):
$-y\log(\hat y) - (1-y)\log(1-\hat y).$
Equation 1 looks like equation 2. Is there a link between them?
The loss function in logistic regression can be viewed as the cross-entropy between two discrete distributions: the true label distribution and the predicted one. See here for details.
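A small numerical sketch of this connection (function names are mine, not from the post): the per-example logistic loss, averaged over labels drawn from a Bernoulli($p$) distribution, is exactly the cross-entropy between Bernoulli($p$) and the predicted Bernoulli($q$), and when the prediction matches the truth ($q = p$) it reduces to the entropy of equation 1.

```python
import math

def binary_entropy(p):
    """Entropy of a Bernoulli(p) variable (equation 1)."""
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def logistic_loss(y, y_hat):
    """Logistic-regression loss for one example (equation 2)."""
    return -y * math.log(y_hat) - (1 - y) * math.log(1 - y_hat)

def cross_entropy(p, q):
    """Expected logistic loss when y ~ Bernoulli(p) and the model predicts q:
    E[loss] = p * loss(1, q) + (1 - p) * loss(0, q) = -p*log(q) - (1-p)*log(1-q)."""
    return p * logistic_loss(1, q) + (1 - p) * logistic_loss(0, q)

p = 0.3
# With a perfect prediction q = p, cross-entropy equals the entropy of equation 1.
assert abs(cross_entropy(p, p) - binary_entropy(p)) < 1e-12
# Any other prediction gives a strictly larger loss (Gibbs' inequality),
# so minimizing the logistic loss pushes q toward the true p.
assert cross_entropy(p, 0.7) > binary_entropy(p)
```

So equation 2 is the cross-entropy for a single observed label, and equation 1 is the value that cross-entropy attains at its minimum.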