I am reading the book A Probabilistic Theory of Pattern Recognition and I am trying to understand the following transformation on page 13:
where $T, B$, and $E$ are i.i.d. random variables with probability density $e^{-x}$ on $[0,\infty)$, i.e.
$$ P[T \leq x] = P[E \leq x] = P[B \leq x] = \int_0^x e^{-u} \, du $$
First of all, I wonder whether the stated equation is actually true, because I have the feeling it should be
$$ \begin{align*} & \quad P\{T+B < 7 - \log 2,\ T+B+E \geq 7 \} \\ & \quad + P\{T+B \geq 7-\log 2,\ T+B+E < 7 \} \\ &= E\{\color{red}{(1-e^{-(7-T-B)})}I_{\{T+B<7-\log 2\}}\} \\ & \quad+ \color{red}{E}\{\color{red}{e^{-(7-T-B)}}I_{\{7>T+B \geq 7-\log 2\}}\} \end{align*}$$
I would like to know how the author made this transformation. Is it perhaps true in general that if $P[X \leq x] = \int_0^x \mu(u) \, du$ holds, then one can say $$ P[X \leq f(Z)] = E[\mu(f(Z))]?$$
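For what it's worth, here is a Monte Carlo sanity check I would run (a sketch, assuming NumPy; the constant `c = 7 - log 2` and the seed are just illustrative). It checks the conditioning step that I believe underlies the first term: since $E \sim \text{Exp}(1)$ is independent of $T+B$, we have $P\{E \geq 7 - s\} = e^{-(7-s)}$ for $s < 7$, so $P\{T+B < 7 - \log 2,\ T+B+E \geq 7\}$ should equal $E\{e^{-(7-T-B)} I_{\{T+B < 7 - \log 2\}}\}$ by conditioning on $T+B$.

```python
import numpy as np

# Monte Carlo check of the conditioning step:
#   P{T+B < 7 - log 2, T+B+E >= 7}  ?=  E[ e^{-(7-T-B)} * 1{T+B < 7 - log 2} ]
# using P{E >= 7 - s} = e^{-(7-s)} for s < 7, with E ~ Exp(1) independent of T+B.
rng = np.random.default_rng(0)
n = 10**6
T, B, E = rng.exponential(size=(3, n))  # three i.i.d. Exp(1) samples per trial
S = T + B
c = 7 - np.log(2)

lhs = np.mean((S < c) & (S + E >= 7))       # direct probability estimate
rhs = np.mean(np.exp(-(7 - S)) * (S < c))   # conditional-expectation form

print(lhs, rhs)  # both should be close to e^{-7} * c**2 / 2 ~ 0.018
```

Both estimates agree for me, which suggests the transformation is exactly this conditioning on $T+B$; the second term would then follow the same way with $P\{E < 7 - s\} = 1 - e^{-(7-s)}$.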
