Combining cross-entropy functions


I have the following two cross-entropy loss functions: for $x \ge 0$, $$x - x z + \log(1 + \exp(-x)),$$ and for $x < 0$, $$-x z + \log(1 + \exp(x)),$$

where $z \in [0, 1]$ is a label and $x$ is a logit. Both of the above expressions have range $[0, \infty)$. Now, the combined formula is $$\max(x, 0) - x z + \log(1 + \exp(-|x|)).$$
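To see why the combined form is useful in practice, here is a minimal sketch (function names are my own, not from TensorFlow) comparing it against the naive cross-entropy computed directly on $\sigma(x)$; the two agree, but the combined form never exponentiates a large positive number:

```python
import math

def loss_stable(x, z):
    # Combined, numerically stable form: max(x, 0) - x*z + log(1 + exp(-|x|)).
    # exp(-|x|) <= 1, so the exponential can never overflow.
    return max(x, 0.0) - x * z + math.log1p(math.exp(-abs(x)))

def loss_naive(x, z):
    # Direct cross-entropy on sigmoid(x); log(1 - s) loses precision
    # and exp(-x) can overflow for large |x|.
    s = 1.0 / (1.0 + math.exp(-x))
    return -z * math.log(s) - (1.0 - z) * math.log(1.0 - s)

# The two forms agree on moderate logits, positive and negative.
for x, z in [(2.0, 0.3), (-5.0, 0.9), (0.5, 1.0)]:
    assert abs(loss_stable(x, z) - loss_naive(x, z)) < 1e-9
```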

However, I'm not quite sure about why there is an absolute value inside the exp.

The actual math is from the TensorFlow documentation (the derivation given for `sigmoid_cross_entropy_with_logits`).


Accepted answer:

Work it out. When $x \ge 0$, $|x| = x$ and $\max(x, 0) = x$, so you immediately recover the first formula. If $x < 0$, then $|x| = -x$ and $\max(x, 0) = 0$, giving:

$$0 - xz + \log(1 + \exp(x)),$$

which is exactly the second formula.
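The case split above can be checked numerically; a minimal sketch (helper names are my own) comparing the combined formula against each branch:

```python
import math

def combined(x, z):
    # max(x, 0) - x*z + log(1 + exp(-|x|))
    return max(x, 0.0) - x * z + math.log1p(math.exp(-abs(x)))

def branch_pos(x, z):
    # Formula for x >= 0:  x - x*z + log(1 + exp(-x))
    return x - x * z + math.log1p(math.exp(-x))

def branch_neg(x, z):
    # Formula for x < 0:  -x*z + log(1 + exp(x))
    return -x * z + math.log1p(math.exp(x))

# For x >= 0 the combined form reduces to the first formula,
# for x < 0 it reduces to the second.
assert math.isclose(combined(3.0, 0.25), branch_pos(3.0, 0.25))
assert math.isclose(combined(-3.0, 0.25), branch_neg(-3.0, 0.25))
```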