Maximum Likelihood Estimation with degenerate functions


I have data that can be described by a background term, $f(x)$, and a signal term, $g(x,\theta)$. In this particular case, $f(x) = x^a$ is a single power law whose index $a$ is known, and $g(x,\theta) = A_0 \left( \frac{x}{x_b} \right)^{\gamma_1} \left[ \frac{1}{2} \left(1+ \left( \frac{x}{x_b} \right) ^{\frac{1}{\Lambda}}\right) \right]^{(\gamma_2-\gamma_1)\Lambda}$ is a smoothly broken power law with normalisation $A_0$, inner and outer power-law indices $\gamma_1$ and $\gamma_2$, break radius $x_b$, and smoothing parameter $\Lambda$, so $\theta = (A_0, \gamma_1, \gamma_2, x_b, \Lambda)$.
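For concreteness, the two components can be sketched in Python like this (the function names and the use of NumPy are my own illustration, not taken from any particular fitting code):

```python
import numpy as np

def f(x, a):
    """Background: single power law with known index a."""
    return x**a

def g(x, A0, gamma1, gamma2, xb, Lam):
    """Signal: smoothly broken power law with normalisation A0,
    inner/outer indices gamma1/gamma2, break radius xb and
    smoothing parameter Lam."""
    return (A0 * (x / xb)**gamma1
            * (0.5 * (1.0 + (x / xb)**(1.0 / Lam)))**((gamma2 - gamma1) * Lam))
```

Note that at $x = x_b$ the bracketed factor equals $1$, so $g(x_b,\theta) = A_0$ regardless of the other parameters.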

The issue is that the broken power law reduces to a single power law with index $a$ when $\gamma_1 = \gamma_2 = a$. In cases where no signal is present, the MLE routine finds that the data are best described by no background term and a signal term acting as a single power law with index $a$: it uses the signal term to fit the background and concludes that the data consist entirely of "signal" with no background, when in fact the opposite is true. Is there any way to modify my likelihood to prevent this from occurring?
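The degeneracy is easy to verify numerically: with $\gamma_1 = \gamma_2 = a$ the bracketed factor in $g$ is raised to the power zero, so $g$ collapses to $A_0 x_b^{-\gamma_1} x^{a}$, i.e. exactly proportional to the background. A minimal check (the index $a = -1.5$ and all other parameter values are purely illustrative):

```python
import numpy as np

def g(x, A0, gamma1, gamma2, xb, Lam):
    # Smoothly broken power law as defined in the question.
    return (A0 * (x / xb)**gamma1
            * (0.5 * (1.0 + (x / xb)**(1.0 / Lam)))**((gamma2 - gamma1) * Lam))

a = -1.5                          # assumed known background index
x = np.logspace(-1, 1, 50)
signal = g(x, A0=2.0, gamma1=a, gamma2=a, xb=1.0, Lam=0.3)

# If g mimics the background x**a, this ratio is constant in x.
ratio = signal / x**a
print(np.allclose(ratio, ratio[0]))  # True: the two models are degenerate
```

Because the two parameterisations produce identical curves, no amount of data can distinguish them through the likelihood alone, which is why the fit is free to attribute everything to the signal term.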