Simplifying a likelihood function


I'm trying to simplify the following equation:

$\log L(\theta|M)=\sum_{d=1}^D\log\bigg((1-\alpha)\exp(-\epsilon_b)\frac{\epsilon_b^{B_d}}{B_d!}\exp(-\epsilon_s)\frac{\epsilon_s^{S_d}}{S_d!}+\alpha(1-\delta)\exp(-(\epsilon_b+\mu))\frac{(\epsilon_b+\mu)^{B_d}}{B_d!}\exp(-\epsilon_s)\frac{\epsilon_s^{S_d}}{S_d!}+\alpha\delta\exp(-\epsilon_b)\frac{\epsilon_b^{B_d}}{B_d!}\exp(-(\epsilon_s+\mu))\frac{(\epsilon_s+\mu)^{S_d}}{S_d!}\bigg)$

The authors drop the constant term $-\log(B_d!\,S_d!)$ and simplify the equation into the following factorization:

$\log L(\theta|M)=\sum_{d=1}^D\bigg(-\epsilon_b-\epsilon_s+M_d(\log x_b+\log x_s)+B_d\log(\mu+\epsilon_b)+S_d\log(\mu+\epsilon_s)\bigg)+\sum_{d=1}^D\log\bigg((1-\alpha)x_s^{S_d-M_d}x_b^{B_d-M_d}+\alpha(1-\delta)\exp(-\mu)\,x_s^{S_d-M_d}x_b^{-M_d}+\alpha\delta\exp(-\mu)\,x_b^{B_d-M_d}x_s^{-M_d}\bigg)$

where $M_d=\min(B_d,S_d)+\frac{\max(B_d,S_d)}{2}$, $x_s=\frac{\epsilon_s}{\epsilon_s+\mu}$ and $x_b=\frac{\epsilon_b}{\epsilon_b+\mu}$.
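To make sure I copied both forms correctly, here is a small numerical check (my own sketch; the parameter values and the daily buy/sell counts are made up): it evaluates the original log-likelihood and the factorized form on the same data and confirms they agree once the dropped constant $\sum_{d=1}^D\log(B_d!\,S_d!)$ is added back.

```python
# Sanity check: original PIN log-likelihood vs. the factorized form.
# Parameter values and (buys, sells) data below are arbitrary examples.
from math import exp, log, factorial

alpha, delta, eps_b, eps_s, mu = 0.2, 0.3, 0.4, 0.5, 0.6
data = [(3, 5), (2, 2), (7, 1)]  # hypothetical daily (B_d, S_d) counts

# Original mixture-of-Poissons log-likelihood (with the factorial terms).
orig = 0.0
for B, S in data:
    orig += log(
        (1 - alpha) * exp(-eps_b) * eps_b**B / factorial(B)
                    * exp(-eps_s) * eps_s**S / factorial(S)
        + alpha * (1 - delta) * exp(-(eps_b + mu)) * (eps_b + mu)**B / factorial(B)
                              * exp(-eps_s) * eps_s**S / factorial(S)
        + alpha * delta * exp(-eps_b) * eps_b**B / factorial(B)
                        * exp(-(eps_s + mu)) * (eps_s + mu)**S / factorial(S)
    )

# Factorized form (constant -log(B_d! S_d!) already dropped).
x_b = eps_b / (eps_b + mu)
x_s = eps_s / (eps_s + mu)
fact = 0.0
for B, S in data:
    M = min(B, S) + max(B, S) / 2
    fact += (-eps_b - eps_s + M * (log(x_b) + log(x_s))
             + B * log(mu + eps_b) + S * log(mu + eps_s))
    fact += log(
        (1 - alpha) * x_s**(S - M) * x_b**(B - M)
        + alpha * (1 - delta) * exp(-mu) * x_s**(S - M) * x_b**(-M)
        + alpha * delta * exp(-mu) * x_b**(B - M) * x_s**(-M)
    )

# The two forms should differ only by the dropped constant.
dropped = sum(log(factorial(B)) + log(factorial(S)) for B, S in data)
print(abs(orig - (fact - dropped)))  # ~0, up to floating-point noise
```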

Unfortunately, I have no clue how this simplification was derived. I would be very grateful if somebody could walk through it step by step.

If there is anything in addition you need to know, please do not hesitate to ask me.

You can find the whole article which mentions the simplification in the following page:

https://cran.r-project.org/web/packages/pinbasic/vignettes/pinbasicVignette.html#general_pin_framework

Thank you for your help!