Define the following function:
$$L(g,\theta) = \begin{cases} 0 & \text{when } g -\epsilon < \theta < g + \epsilon \\ 1 & \text{otherwise} \end{cases} $$
where $g = g(\bar{x})$ is an estimator based on a random sample $\bar{x}$.
I am trying to understand how the solutions arrive at
$$\mathbb{E} (L(g,\theta)) = 1 - \int_{g-\epsilon}^{g+\epsilon} f(\theta \mid \bar{x}) \, d\theta$$
where $f(\theta \mid \bar{x})$ is the posterior density. Mainly: where does the minus sign come from?
Define the function $$M(g,\theta) = \begin{cases} 1 & \text{when } g -\epsilon < \theta < g + \epsilon \\ 0 & \text{otherwise} \end{cases} $$
Then $M + L = 1$, so by linearity of expectation,
$$\mathbb{E}(L) + \mathbb{E}(M) = 1.$$
Since $M$ is the indicator of the event $\{g - \epsilon < \theta < g + \epsilon\}$, its expectation is the posterior probability of that event:
$$\mathbb{E}(M) = P(g - \epsilon < \theta < g + \epsilon \mid \bar{x}) = \int_{g-\epsilon}^{g+\epsilon} f(\theta \mid \bar{x}) \, d\theta.$$
Thus
$$\mathbb{E}(L(g,\theta)) = 1 - \mathbb{E}(M) = 1 - \int_{g-\epsilon}^{g+\epsilon} f(\theta \mid \bar{x}) \, d\theta.$$
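As a quick numerical sanity check of this identity, here is a sketch that assumes a standard normal posterior for $\theta$ with $g = 0$ and $\epsilon = 0.5$ (an illustrative choice, not from the question): the Monte Carlo average of the 0-1 loss should match $1$ minus the posterior mass of the interval.

```python
import math
import random

# Illustrative setup (not from the question): take the posterior
# f(theta | x) to be standard normal, with g = 0 and epsilon = 0.5.
g, eps = 0.0, 0.5

def posterior_cdf(t):
    # Standard-normal CDF written via the error function.
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Exact value: E(L) = 1 - integral of the posterior over (g-eps, g+eps).
exact = 1.0 - (posterior_cdf(g + eps) - posterior_cdf(g - eps))

# Monte Carlo: average the 0-1 loss L over draws of theta from the posterior.
random.seed(0)
n = 200_000
loss = sum(1 for _ in range(n)
           if not (g - eps < random.gauss(0.0, 1.0) < g + eps)) / n

print(exact, loss)  # the two values should agree to about two decimals
```

The draw-and-average estimate converges to the integral expression as $n$ grows, which is exactly the statement $\mathbb{E}(L) = 1 - \mathbb{E}(M)$.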