Why do we subtract $1$ in this definition of the Shannon entropy?


In the context of entropic regularization, the (negative) entropy appears as follows (last term):

[image: the entropic regularization objective; its last term is $\sum_{i,j} x_{i,j}\bigl(\log(x_{i,j}) - 1\bigr)$]

Why do we subtract $1$? So far, I have only seen the entropy written as $\sum x_{i,j}\log(x_{i,j})$.

What is the intuition or meaning behind doing so?
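
For concreteness, here is a quick numerical check (a sketch in Python/NumPy; the variable names are made up) of how the two expressions relate. Since $\sum_{i,j} x_{i,j}\bigl(\log(x_{i,j}) - 1\bigr) = \sum_{i,j} x_{i,j}\log(x_{i,j}) - \sum_{i,j} x_{i,j}$, the two versions differ only by the total mass of $x$:

```python
import numpy as np

# Hypothetical transport plan x, normalized so its entries sum to 1
# (as for a coupling between two probability distributions).
rng = np.random.default_rng(0)
x = rng.random((3, 3))
x /= x.sum()

plain = np.sum(x * np.log(x))                  # sum_ij x_ij * log(x_ij)
with_minus_one = np.sum(x * (np.log(x) - 1))   # sum_ij x_ij * (log(x_ij) - 1)

# The difference is exactly sum(x), here 1 -- a constant whenever the
# total mass of x is fixed by the problem's constraints.
print(plain - with_minus_one)
```

So when the constraints fix $\sum_{i,j} x_{i,j}$, the extra $-1$ only shifts the objective by a constant and does not change the minimizer.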