In the context of entropic regularization, the (negative) entropy enters the regularized objective as its last term, in the form

$$\sum_{i,j} x_{i,j}\bigl(\log(x_{i,j}) - 1\bigr).$$
Why do we subtract the $1$? So far, I have only seen the entropy written as $\sum_{i,j} x_{i,j}\log(x_{i,j})$.
What is the intuition or meaning behind doing so?
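For reference, the only computational difference I can see (this is just my own observation, not something from the texts I have read) is that the extra $-1$ makes the gradient cleaner:

$$\frac{\partial}{\partial x_{i,j}}\, x_{i,j}\log(x_{i,j}) = \log(x_{i,j}) + 1,
\qquad
\frac{\partial}{\partial x_{i,j}}\, x_{i,j}\bigl(\log(x_{i,j}) - 1\bigr) = \log(x_{i,j}).$$

Is this the reason, or is there a deeper meaning?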
