Consider a binary symmetric channel with probability of error $p$.
Consider the following scheme for encoding a one-bit message.
If we have to transmit 0, we send $n$ zeros $000 \cdots 0$. If we have to transmit 1, we send $n$ ones $111 \cdots 1$.
The scheme for decoding is as follows.
Observe the output sequence.
Decode 0 if the majority of the received bits are zeros.
Decode 1 if the majority of the received bits are ones.
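To make the majority-vote decoder concrete, here is a small Python sketch that computes its exact error probability (the function name `repetition_error_prob` is my own; ties, which can occur only for even $n$, are counted as errors):

```python
from math import comb

def repetition_error_prob(n, p):
    """Exact probability that majority decoding fails when a block of
    n identical bits is sent over a BSC with crossover probability p.
    An error occurs when at least ceil(n/2) of the n bits are flipped;
    a tie (possible only for even n) is counted as an error."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range((n + 1) // 2, n + 1))

# The error probability shrinks as the block length grows:
for n in (1, 3, 5, 11, 101):
    print(n, repetition_error_prob(n, 0.1))
```

For $p = 0.1$ the error probability drops from $0.1$ at $n = 1$ to $0.028$ at $n = 3$, and keeps falling, which is exactly the behavior the question asks us to prove.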
My question is the following: how can I prove that the probability of error in this process goes to zero as $n$ (the number of bits we send) goes to infinity? I'm supposed to use the Law of Large Numbers, but I'm not sure how.

Let $X$ denote the total number of flipped bits among the $n$ transmitted bits.
$X$ is a random variable following the binomial distribution $Bin(n, p)$, with mean $~E[X] = np~$ and variance (square of the standard deviation) $~V[X] = npq~$, where $q = 1-p$.
We get an error when the fraction of the $n$ bits that are flipped exceeds one half. Thus we are interested in the scaled random variable $$\begin{align} Y &\equiv \frac{X}n \quad \text{with} & E[Y]&=\frac{E[X]}n = p\\ && V[Y]&=\frac{ V[X]}{n^2} = \frac{pq}{n} \end{align}$$
Note that we need the strict inequality $p < \frac{1}2$ for the whole process to work: only then does the decision threshold $\frac12$ sit a fixed distance $\Delta = \frac12 - p > 0$ above the mean $p$.
Slightly more formally, the probability of $Y$ (the fraction of interest) being at distance at least $\Delta$ away from the mean $p$ satisfies
$$ Pr\bigg\{ ~ |Y - p| \geq \Delta ~\bigg\} \leq \left(\frac{\sigma}{\Delta} \right)^2 = \frac{pq}{n\,\Delta^2} \qquad \to 0 \quad \text{as} \quad n \to \infty,$$
where $\sigma^2 = V[Y] = pq/n$. This is a common and elementary form of Chebyshev's inequality. A decoding error requires $Y \geq \frac12$, hence $|Y - p| \geq \Delta$ with $\Delta = \frac12 - p$, so the error probability vanishes.
Writing $X \equiv \sum_{k = 1}^n B_k$, where the $B_k$ are i.i.d. Bernoulli random variables with two outcomes (success with probability $p$, failure with $q$) and $E[B_k] = p$ for all $k$, this is exactly the Law of Large Numbers applied in this specific context: the vanishing variance of the `sample mean' $Y$ forces $Y$ to concentrate around the true mean $p$.
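The Chebyshev bound above can also be evaluated numerically; a minimal sketch (the function name `chebyshev_error_bound` is mine, and it uses $\Delta = \frac12 - p$ as in the argument above):

```python
def chebyshev_error_bound(n, p):
    """Chebyshev upper bound on the decoding-error probability.
    An error requires Y = X/n to deviate from its mean p by at least
    Delta = 1/2 - p, so P(error) <= V[Y] / Delta^2 = p*q / (n * Delta^2).
    Requires p < 1/2."""
    q = 1.0 - p
    delta = 0.5 - p
    return p * q / (n * delta**2)

# The bound decays like 1/n (here p = 0.1, so Delta = 0.4):
for n in (10, 100, 1000):
    print(n, chebyshev_error_bound(n, 0.1))
```

The bound is loose (the true error probability decays exponentially in $n$, as a Chernoff bound would show), but it is enough to prove that the error probability goes to zero.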