Understanding a likelihood as a loss function


Paper Link (IJCAI'18 Yang et al): http://dmkd.cs.vt.edu/papers/IJCAI18.pdf

In the paper linked above, the authors define Eq. 8 as the conditional check-in rate of the Recurrent-Censored Regression model, and Eq. 14 as the negative likelihood of observing the training instances, which serves as the loss function.

$r(t \mid X, U) = r_0(t)\exp(H(t)) \hspace{1cm} (8)$

$\prod_{i:\delta_i=1} \dfrac{\exp(H(t_i))}{\sum_{j:t_j \geq t_i} \exp(H(t_j))} \hspace{1cm} (14)$

1) Is Eq. 14 a negative likelihood, or is it a likelihood?

2) I do not understand why the denominator is summed over $j:t_j\geq t_i$.

I noticed that Eq. 14 looks similar to the softmax function according to this link, but I do not fully understand the connection.
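To make the question concrete, here is a small sketch (not the authors' code; the variable names are my own) of how the log of the product in Eq. 14 can be computed. The denominator at each event time $t_i$ sums $\exp(H(t_j))$ over the "risk set" of all instances $j$ with $t_j \geq t_i$, which is what gives each factor its softmax-like form:

```python
import numpy as np

def partial_log_likelihood(times, events, scores):
    """Log of the product in Eq. 14 (its negation would be the loss).

    times  : observed times t_i
    events : indicators delta_i (1 = event observed, 0 = censored)
    scores : model outputs H(t_i)
    """
    ll = 0.0
    for i in range(len(times)):
        if events[i] == 1:
            # Risk set: every instance j still "at risk" at t_i, i.e. t_j >= t_i.
            risk_scores = scores[times >= times[i]]
            # log[ exp(H(t_i)) / sum_{j in risk set} exp(H(t_j)) ]
            ll += scores[i] - np.log(np.sum(np.exp(risk_scores)))
    return ll

times = np.array([2.0, 3.0, 5.0, 7.0])
events = np.array([1, 0, 1, 1])
scores = np.array([0.5, -0.2, 1.0, 0.3])
print(partial_log_likelihood(times, events, scores))  # a negative number
```

Note that the censored instance (`events[1] == 0`) contributes no factor of its own, but its score still appears in the denominators of earlier event times through the risk-set sum.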