I came across the following notation:
$$ Err = P\{a < \theta\} \\ Err = E[1_{[a< \theta ]}] $$
where $1_{[\pi ]}$ is the indicator function, equal to $1$ if $\pi$ is true and $0$ otherwise.
I don't understand why the probability of an event is equal to the expected value of its indicator.
Let $A$ be an event. The indicator $1_{[A]}$ takes the value $1$ when $A$ occurs and $0$ when $A^c$ occurs, so by the definition of expectation $E[1_{[A]}]=0 \cdot P(A^c) + 1 \cdot P(A) = P(A)$.
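You can check this identity concretely. A minimal sketch in Python, using a fair six-sided die and the (hypothetical) event $A = \{\text{roll} \ge 5\}$:

```python
from fractions import Fraction

outcomes = range(1, 7)
p = Fraction(1, 6)                    # uniform pmf for a fair die

def indicator(x):
    return 1 if x >= 5 else 0         # 1_A(x)

# E[1_A] = sum over outcomes of 1_A(x) * P(x)
expectation = sum(indicator(x) * p for x in outcomes)

# P(A) computed directly by summing the pmf over A
prob = sum(p for x in outcomes if x >= 5)

print(expectation, prob)  # both equal 1/3
```

The two sums are the same because the indicator simply picks out the outcomes in $A$, which is exactly what summing $P$ over $A$ does.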