I'm looking at the following situation, where the quantity $|E'|$ is estimated, given parameters $\varepsilon$, $\delta_1$, $n$, $p$, and $r$.
- $\mathbb{E}\left[\left|E^{\prime}\right|\right]=e(H) p=\left(1 \pm \delta_{1}\right) \frac{\varepsilon n}{r}$
- $\operatorname{Var}\left(\left|E^{\prime}\right|\right)=e(H) p(1-p) \leq \mathbb{E}\left[\left|E^{\prime}\right|\right]=o\left(\mathbb{E}\left[\left|E^{\prime}\right|\right]^{2}\right)$
- $\therefore$ Chebyshev $\Rightarrow$ with high probability, $\left|E^{\prime}\right|=\left(1 \pm 2 \delta_{1}\right) \frac{\varepsilon n}{r}$
Could someone elaborate on the last claim?
Edit: this notation is used in the context of a graph-theoretic lemma.

We want to show that, with high probability, $|E^{\prime}| = (1 \pm 2 \delta_{1}) \frac{\varepsilon n}{r}$. Equivalently, $$\left||E^{\prime}| - \frac{\varepsilon n}{r} \right| \leq \frac{2\delta_1 \varepsilon n}{r}. \tag{1}\label{eq:1}$$ Because $\mathbb{E}\bigl[|E^{\prime}|\bigr] = (1 \pm \delta_{1}) \frac{\varepsilon n}{r}$, i.e. $\bigl|\mathbb{E}\bigl[|E^{\prime}|\bigr] - \frac{\varepsilon n}{r}\bigr| \leq \frac{\delta_1 \varepsilon n}{r}$, the triangle inequality shows it is enough to prove $$\bigl||E^{\prime}| - \mathbb{E}\bigl[|E^{\prime}|\bigr] \bigr| \leq \frac{\delta_1 \varepsilon n}{r}:$$ indeed, combining the two bounds gives $$\left||E^{\prime}| - \frac{\varepsilon n}{r}\right| \leq \bigl||E^{\prime}| - \mathbb{E}\bigl[|E^{\prime}|\bigr]\bigr| + \left|\mathbb{E}\bigl[|E^{\prime}|\bigr] - \frac{\varepsilon n}{r}\right| \leq \frac{2\delta_1 \varepsilon n}{r}.$$
We are given that $\operatorname{Var}\bigl(|E^{\prime}|\bigr) = o\Bigl(\mathbb{E}\bigl[|E^{\prime}|\bigr]^{2}\Bigr) = o(n^2)$, where the second equality holds because $\mathbb{E}\bigl[|E^{\prime}|\bigr] = \Theta(n)$ for fixed $\varepsilon$, $\delta_1$, and $r$. So, by Chebyshev's inequality applied with deviation $\frac{\delta_1 \varepsilon n}{r} = \Theta(n)$, $$\mathbb{P}\left(\bigl||E^{\prime}| - \mathbb{E}\bigl[|E^{\prime}|\bigr] \bigr| > \frac{\delta_1 \varepsilon n}{r}\right) \leq \frac{\operatorname{Var}\bigl(|E^{\prime}|\bigr)}{(\delta_1 \varepsilon n / r)^2} = \frac{o(n^2)}{\Theta(n^2)} = o(1).$$ Thus, \eqref{eq:1} holds with high probability, as claimed.
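To see the Chebyshev step numerically, here is a small simulation sketch. The parameters are purely illustrative (they are not taken from the lemma): I assume $|E'| \sim \mathrm{Binomial}(e(H), p)$ with $e(H) = 10{,}000$ and $p = 0.1$, and take an allowed relative deviation $\delta = 0.1$ in place of $\delta_1 \varepsilon / r$. The empirical frequency of large deviations should sit well below the Chebyshev bound $\operatorname{Var}/t^2$.

```python
import random

# Illustrative parameters (assumed, not from the lemma):
# |E'| ~ Binomial(e(H), p), so E[|E'|] = e(H) p and
# Var(|E'|) = e(H) p (1 - p).
random.seed(42)
eH, p, delta = 10_000, 0.1, 0.1
mean = eH * p                 # E[|E'|] = 1000
var = eH * p * (1 - p)        # Var(|E'|) = 900
t = delta * mean              # allowed deviation from the mean

# Chebyshev: P(| |E'| - E[|E'|] | > t) <= Var(|E'|) / t^2
chebyshev_bound = var / t**2  # = 900 / 100^2 = 0.09

# Sample |E'| exactly as a sum of e(H) independent Bernoulli(p) trials
# and count how often the deviation exceeds t.
trials = 300
exceed = sum(
    1
    for _ in range(trials)
    if abs(sum(random.random() < p for _ in range(eH)) - mean) > t
)
empirical = exceed / trials
print(chebyshev_bound, empirical)
```

Here $t = \delta \cdot \mathbb{E}[|E'|]$ is about $3.3$ standard deviations, so the true deviation probability is far smaller than the Chebyshev bound; Chebyshev is crude, but a ratio of $o(1)$ is all the lemma needs.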