Let $x$ be a random bit string taking values in $\{0,1\}^n$. Let $r$ be the value of the most significant bit (MSB) of $x$; $r$ is a r.v. that equals $1$ or $0$ with equal probability. Let $g$ be our guess for the MSB of $x$. The guess $g$ is a r.v. representing that we somehow guess the value of $r$ for the bit string $x$. The method of guessing is left unspecified, but we know some probability facts about it.
It is given that we guess the value of $r$ correctly with the following probability: $$\Pr[r=g] > \frac12 +\alpha.$$ Let $p_0$ be the probability that we guess $1$ (i.e., $g=1$) given $r$ is $0$, and $p_1$ be the probability that we guess correctly (again $g=1$) given $r$ is $1$: $$p_0 = \Pr[g=1\mid r=0]$$ $$p_1 = \Pr[g=1\mid r=1]$$
Fact $1$: From the previous part we know: $$\frac12 +\alpha < \Pr[r=g] = \frac12(1-p_0) +\frac12p_1.$$ Rearranging gives $$p_1-p_0>2\alpha.$$
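As a quick numeric sanity check of Fact $1$ (not part of the problem), here is a simulation sketch; the values $p_0 = 0.2$, $p_1 = 0.9$, $\alpha = 0.3$ are hypothetical, chosen so that $p_1 - p_0 > 2\alpha$. The empirical $\Pr[r=g]$ should land near $\frac12(1-p_0)+\frac12 p_1 = 0.85$:

```python
import random

# Hypothetical numbers for illustration only (not from the problem):
p0, p1 = 0.2, 0.9   # Pr[g=1 | r=0], Pr[g=1 | r=1]
alpha = 0.3         # note p1 - p0 = 0.7 > 2*alpha = 0.6

random.seed(0)
trials = 200_000
correct = 0
for _ in range(trials):
    r = random.randint(0, 1)             # MSB, uniform on {0, 1}
    p = p1 if r == 1 else p0             # chance the guesser outputs 1
    g = 1 if random.random() < p else 0
    correct += (g == r)

print(correct / trials)  # close to 0.5*(1-p0) + 0.5*p1 = 0.85
```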
Let $\hat p_0$ and $\hat p_1$ be our estimates of $p_0$ and $p_1$, obtained by sampling bit strings from $\{0,1\}^n$. Let $r_i$ be the value of the MSB of the string $x$ in the $i$th sample; for that $i$th sample we make a guess, denoted $g_i$.
Do $m$ samples and then estimate those probabilities as follows:
$$\hat p_0 = \frac1m \sum^m_{i=1} g_i \quad \text{over the $m$ samples with } r_i = 0 $$
$$\hat p_1 = \frac1m \sum^m_{i=1} g_i \quad \text{over the $m$ samples with } r_i = 1 $$
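To make the estimators concrete, here is a small simulation sketch. The values of $p_0$, $p_1$, $m$ and the reading "take $m$ samples of each kind" are my assumptions, not part of the problem:

```python
import random

random.seed(1)
p0, p1 = 0.2, 0.9  # hypothetical true conditionals (unknown to the estimator)
m = 10_000         # number of samples of each kind (assumed interpretation)

# Guesses g_i for m samples with r_i = 0, and for m samples with r_i = 1:
guesses_r0 = [1 if random.random() < p0 else 0 for _ in range(m)]
guesses_r1 = [1 if random.random() < p1 else 0 for _ in range(m)]

p0_hat = sum(guesses_r0) / m  # empirical frequency of g = 1 among r = 0 samples
p1_hat = sum(guesses_r1) / m
print(p0_hat, p1_hat)         # each close to the true p0, p1
```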
Fact $2$ (to prove): Show that for $m>\dfrac{1}{4\left(\frac\alpha2\right)^2 \left(\frac\delta2\right)}$,
$$\Pr \left[|p_0-\hat p_0|<\frac\alpha2\right] > 1 -\frac\delta2$$ $$\Pr \left[|p_1-\hat p_1|<\frac\alpha2\right] > 1 -\frac\delta2$$
[Hint $1$: Law of Large Numbers (LLN) or Chebyshev's inequality]
[Hint $2$: the variance of a r.v. in the range from $0$ to $1$ is upper-bounded by $1/4$]
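Combining the two hints, the estimate $\hat p_0$ is a sample mean with $\operatorname{Var}(\hat p_0) \le \frac{1}{4m}$, so Chebyshev suggests a tail bound of the form $\Pr[|\hat p_0 - p_0| \ge \frac\alpha2] \le \frac{1}{4m(\alpha/2)^2}$. A simulation sketch checking this empirically, with hypothetical values of $p_0$, $\alpha$, and $m$:

```python
import random

# Hypothetical parameters; the bound 1/(4*m*(alpha/2)^2) uses Hint 2
# (the variance of a [0,1]-valued r.v. is at most 1/4).
p0, alpha = 0.2, 0.3
m = 200
runs = 5_000

random.seed(2)
bad = 0
for _ in range(runs):
    # One experiment: estimate p0 from m conditional samples.
    p0_hat = sum(random.random() < p0 for _ in range(m)) / m
    bad += (abs(p0_hat - p0) >= alpha / 2)

bound = 1 / (4 * m * (alpha / 2) ** 2)
print(bad / runs, "<=", bound)  # empirical tail frequency vs. Chebyshev bound
```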
I have tried multiple things; however, I am having trouble applying Chebyshev's inequality or the LLN. What confuses me about the hints is that the LLN is just an asymptotic result: a limit saying that we approach the true value for a large number of samples. I am not sure why Chebyshev or the LLN would produce a condition on $m$ like the one above; it seems a bit odd, since Chebyshev is not a large-sample result. Also, $\delta$ is never really specified; currently I think it might mean the difference between the true value and the estimate, but I am not sure what it is supposed to mean. One thing I have realized is that even though Chebyshev's inequality uses the variance, we don't need the exact variance because we can bound it by $1/4$. I have also used the fact that $\Pr[A]+\Pr[A']=1$ to get the inequality in Chebyshev's pointing in the right direction, but I am not sure how to proceed.
A more general question I have: when do probability bounds/inequalities take the form "for $m >$ (some function), the following probability bound or condition holds"?
If anyone knows how to do this or has ideas about what to try, it would be greatly appreciated! Suggestions of things to try are acceptable answers.
[Note that this is not a homework question.] [Context: translated from lecture notes for a course I am reading about.]