I have an easy question about indicator functions.
Let $A$ be any event. We can write $\Bbb P(A)$ as an expectation, as follows:
Define the indicator function:
$$ I_A = \begin{cases} 1, & \text{if event $A$ occurs} \\ 0, & \text{otherwise} \end{cases} $$
Then $I_A$ is a random variable. Why do the first and last equalities hold here?
$$ \Bbb E(I_A) = \sum_{r=0}^1 r \cdot \Bbb P(I_A = r) = \Bbb P(A). $$
In general, if $X$ is a random variable with $\mathsf P(X\in S)=1$ for a countable set $S$, then: $$\mathsf EX=\sum_{s\in S}s\,\mathsf P(X=s)$$
Observe that $\mathsf P(1_A\in\{0,1\})=1$, so this applies with $S=\{0,1\}$ and gives the first equality.
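Writing that two-term sum out explicitly: $$\mathsf E(1_A)=\sum_{r=0}^1 r\cdot\mathsf P(1_A=r)=0\cdot\mathsf P(1_A=0)+1\cdot\mathsf P(1_A=1)=\mathsf P(1_A=1).$$ So it only remains to check that $\mathsf P(1_A=1)=\mathsf P(A)$.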
$\mathsf P(X=1)$ is just an abbreviation of $\mathsf P(\{\omega\in\Omega\mid X(\omega)=1\})$.
For $X=1_A$ we find: $$\{\omega\in\Omega\mid X(\omega)=1\}=\{\omega\in\Omega\mid 1_A(\omega)=1\}=A$$ so that: $$\mathsf P(1_A=1)=\mathsf P(A)$$
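As a quick numerical sanity check (a sketch, not part of the proof): pick a concrete event, say $A=\{\text{a fair die shows an even number}\}$ with $\mathsf P(A)=1/2$ (my own hypothetical choice), simulate the indicator $1_A$, and compare its sample mean to $\mathsf P(A)$.

```python
import random

random.seed(0)
n = 100_000

# Event A: a fair six-sided die shows an even number, so P(A) = 1/2.
# The indicator 1_A is 1 when A occurs and 0 otherwise.
indicators = [1 if random.randint(1, 6) % 2 == 0 else 0 for _ in range(n)]

# By the law of large numbers, the sample mean of 1_A approximates
# E(1_A), which the argument above shows equals P(A) = 0.5.
estimate = sum(indicators) / n
print(estimate)
```

The sample mean of the 0/1 outcomes settles near $0.5$, illustrating $\mathsf E(1_A)=\mathsf P(A)$ empirically.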