The law of large numbers is the following statement:
$\lim_{n \rightarrow\infty}(P(|S_n/n - \mu| < \epsilon))=1$, for all $\epsilon>0$
where $S_n = X_1 + X_2 + \cdots + X_n$ is a sum of i.i.d. random variables with mean $\mu$. For this example, suppose $X_i = 1$ if the $i$-th toss lands heads and $0$ otherwise, with $P(X_i = 1) = P(X_i = 0) = 0.5$, so $\mu = 0.5$.
Suppose I flip a coin a large (but finite) number of times, say $n = 10,000$. Does the law of large numbers make it highly probable that the number of heads I get will deviate from $n/2 = 5,000$ by at most $k$, where $k \ll n$?
For example, does it specify a lower bound on the expression below?
$P(|S_{10000} - 5{,}000| < 100) \geq\ ?$
Or does it say nothing about large but finite $n$?
EDIT:
This is where I reached:
$\lim_{n \rightarrow\infty}P(|S_{n}/n - 0.5| < \epsilon) = 1$
$\lim_{n \rightarrow\infty}P(|nS_{n}/n - 0.5n| < n\epsilon) = 1$
$\lim_{n \rightarrow\infty}P(|S_{n} - 0.5n| < n\epsilon) = 1$, for all $\epsilon>0$
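A quick Monte Carlo sketch of the limit in the statement (the $\epsilon$, trial count, and seed below are arbitrary choices for illustration): for fixed $\epsilon$, the empirical probability $P(|S_n/n - 0.5| < \epsilon)$ climbs toward $1$ as $n$ grows.

```python
import random

def empirical_prob(n, eps=0.02, trials=500, seed=0):
    """Estimate P(|S_n/n - 0.5| < eps) for n fair coin flips."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.random() < 0.5 for _ in range(n))  # S_n = number of heads
        if abs(s / n - 0.5) < eps:
            hits += 1
    return hits / trials

# Std. dev. of S_n/n is 0.5/sqrt(n): 0.05 at n=100, 0.005 at n=10,000,
# so the event |S_n/n - 0.5| < 0.02 goes from unlikely to near-certain.
p_small = empirical_prob(100)
p_large = empirical_prob(10_000)
print(p_small, p_large)
```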
The weak law of large numbers essentially only states that the random variable $S_n/n$, the fraction of heads obtained, converges in probability to its expected value $E\left[\frac{S_n}n\right]=\mu=0.5$. Graphically, this means that as $n$ grows, the PMF of $S_n/n$ becomes increasingly concentrated around $0.5$.
For the asymptotic behaviour of $S_n$ itself, you have to refer to the Central Limit Theorem, which states that a suitably normalized sum of $n$ independent, identically distributed random variables $X_i$, $i=1,2,3,\dots$, is asymptotically normal under certain conditions. Also note that in your example, $S_n$ follows a binomial distribution; the special case of the CLT for binomial distributions is the de Moivre–Laplace theorem.
For large but finite $n$, you may approximate $S_n$ by a normal distribution, in accordance with the CLT. For lower bounds on $P(|S_n-E[S_n]|<k\sqrt{V[S_n]})$ for all positive $k$, Chebyshev's inequality comes to our aid. Since $E[S_n]=n/2$ and $V[S_n]=n/4$, Chebyshev's inequality gives$$P\left(|S_n-n/2|<\frac{k\sqrt n}2\right)\ge1-\frac1{k^2}$$or, setting $m=k\sqrt n/2$,$$P(|S_n-n/2|<m)\ge1-\frac n{4m^2}.$$
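Plugging in the question's own numbers ($n = 10{,}000$, $m = 100$): Chebyshev gives the guaranteed lower bound $1 - n/(4m^2) = 0.75$, while the CLT normal approximation suggests the actual probability is around $0.95$. A short sketch of both computations using only the standard library:

```python
import math

n, m = 10_000, 100

# Chebyshev lower bound: P(|S_n - n/2| < m) >= 1 - n / (4 m^2)
chebyshev = 1 - n / (4 * m**2)

# Normal approximation: S_n is approximately N(n/2, n/4), so
# P(|S_n - n/2| < m) ~= 2*Phi(m / sigma) - 1 with sigma = sqrt(n)/2.
sigma = math.sqrt(n) / 2
phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))  # standard normal CDF
normal_approx = 2 * phi(m / sigma) - 1

print(chebyshev)      # 0.75
print(normal_approx)  # ~0.9545
```

The gap between $0.75$ and $\approx 0.95$ is expected: Chebyshev is distribution-free and therefore loose, while the normal approximation exploits the actual shape of the binomial.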