Given a random variable $X$ taking values in $\{0,1,\ldots,n\}$, we define the $r$th factorial moment by
$$\mathbb{E}_r[X] := \mathbb{E}[X(X-1)\cdots(X-r+1)]$$
By explicit computation, one can easily show that
$$P(X=0) = \sum_{r=0}^n (-1)^r \frac{\mathbb{E}_r[X]}{r!},$$
where $\mathbb{E}_0[X] := 1$ (the empty product).
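As a quick sanity check of this identity (with the $r=0$ term, which equals $1$, included in the sum), here is a hypothetical numerical example: $X$ is taken to be the number of fixed points of a uniformly random permutation of four elements, so $P(X=0)$ is the derangement probability $9/24$.

```python
from itertools import permutations
from math import comb, factorial

# Hypothetical example: X = number of fixed points of a uniformly random
# permutation of {0,1,2,3}; X takes values in {0,...,4}.
n = 4
perms = list(permutations(range(n)))
pmf = [0.0] * (n + 1)
for p in perms:
    fixed = sum(1 for i, v in enumerate(p) if i == v)
    pmf[fixed] += 1 / len(perms)

def factorial_moment(pmf, r):
    # E_r[X] = E[X(X-1)...(X-r+1)] = r! * E[C(X, r)]
    return factorial(r) * sum(prob * comb(k, r) for k, prob in enumerate(pmf))

lhs = pmf[0]  # P(X = 0) = 9/24 (derangements of 4 elements)
rhs = sum((-1)**r * factorial_moment(pmf, r) / factorial(r) for r in range(n + 1))
print(abs(lhs - rhs) < 1e-12)  # True
```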
In Probabilistic Combinatorics and Applications, by Béla Bollobás and Fan R. K. Chung, it is claimed that the partial sums satisfy the alternating inequalities, i.e. $$ \sum_{r=0}^m (-1)^r \frac{\mathbb{E}_r[X]}{r!} \ : \ P(X=0),$$
where $:$ is $\ge$ if $m$ is even, and $\le$ if $m$ is odd. This apparently follows from the inclusion–exclusion principle. I have tried to show this by writing $X$ as a sum of indicator variables, but it gets quite messy. Could someone sketch the proof, just outlining any tricks that may be used?
Note that $E_r[X]/r!=E\left[\binom{X}r\right]$, so
$$ \sum_{r=0}^m\frac{(-1)^rE_r[X]}{r!}=E\left[\sum_{r=0}^m(-1)^r\binom{X}r\right]=E\left[(-1)^m\binom{X-1}{m}\right]. $$
For a nice proof of the second equality, see the answers at Bonferroni Inequalities. Notice that
$$ (-1)^m\binom{X-1}m= \begin{cases} 1 & X=0\\ 0 & 0<X\le m\\ (-1)^m\times\text{a positive number} & m<X. \end{cases} $$
For $X=0$, we are using the convention $\binom{-1}m=\frac{(-1)(-2)\cdots (-m)}{m!}=(-1)^m$. If you are uncomfortable with this, you can directly check that $\sum_{r=0}^m(-1)^r\binom{X}r=1$ when $X=0$.
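The key identity $\sum_{r=0}^m(-1)^r\binom{k}{r}=(-1)^m\binom{k-1}{m}$ can be verified exactly for small values, using the generalized binomial coefficient so that the $k=0$ case (i.e. $\binom{-1}{m}$) is covered by the convention above. This is just an illustrative check, not part of the proof:

```python
from fractions import Fraction
from math import comb, factorial

def gbinom(x, m):
    """Generalized binomial coefficient x(x-1)...(x-m+1)/m!, for any integer x."""
    num = 1
    for i in range(m):
        num *= x - i
    return Fraction(num, factorial(m))

for k in range(8):          # possible values of X
    for m in range(8):      # truncation points
        lhs = sum((-1)**r * comb(k, r) for r in range(m + 1))
        rhs = (-1)**m * gbinom(k - 1, m)
        assert lhs == rhs
print("identity verified")
```

In particular `gbinom(-1, m)` returns $(-1)^m$, matching the stated convention.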
Therefore, $(-1)^m\binom{X-1}m$ is an overestimator (for $m$ even) or an underestimator (for $m$ odd) of the indicator variable $1(X=0)$. Conclude by taking expectations.
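The resulting alternating inequalities can be checked numerically. Below is a minimal sketch using a hypothetical pmf on $\{0,\dots,6\}$ with random weights; each truncated sum $E\left[\sum_{r=0}^m(-1)^r\binom{X}{r}\right]$ is compared against $P(X=0)$:

```python
from math import comb
import random

random.seed(1)
# Hypothetical example: an arbitrary pmf on {0,...,6}
n = 6
w = [random.random() for _ in range(n + 1)]
pmf = [x / sum(w) for x in w]

p0 = pmf[0]
for m in range(n + 1):
    # m-th partial sum = E[ sum_{r=0}^m (-1)^r C(X, r) ]
    partial = sum(pmf[k] * sum((-1)**r * comb(k, r) for r in range(m + 1))
                  for k in range(n + 1))
    if m % 2 == 0:
        assert partial >= p0 - 1e-12   # even m: overestimate of P(X=0)
    else:
        assert partial <= p0 + 1e-12   # odd m: underestimate
print("alternating inequalities hold")
```

At $m=n$ the partial sum recovers $P(X=0)$ exactly, matching the full identity.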