When we toss an unbiased coin, the probability of observing heads (and likewise tails) is 1/2. I take that to mean that over a very large number of coin tosses, the number of times the coin comes up heads will be almost equal* to the number of times it comes up tails.
My question is: if we witness a series of coin tosses that happens to contain many more of, say, heads than tails, should we not expect the upcoming tosses to be 'mean-reverting', i.e. more inclined to produce tails than heads, simply in order to maintain the probability of 1/2 described in the previous paragraph?
I think what I am really asking is whether unbiased coin tossing is a Markov process. I would add that if it is, then my understanding of why the probability of heads/tails is 1/2 is wrong.
[*] If not almost equal, then we need more coin tosses, so that the ratio of the number of heads (or tails) to the total number of tosses approaches 1/2 as the number of tosses approaches infinity.
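To make the 'mean-reverting' question concrete, here is a minimal simulation sketch (the function name, the streak length of 5, and the sample size are arbitrary choices, not part of the question): it estimates the probability of heads on the toss immediately following a run of 5 consecutive heads, which is exactly where any mean reversion would have to show up.

```python
import random

def prob_heads_after_streak(n_tosses=1_000_000, streak=5, seed=0):
    """Estimate P(heads | previous `streak` tosses were all heads)
    from one long sequence of simulated fair-coin tosses."""
    rng = random.Random(seed)
    tosses = [rng.randint(0, 1) for _ in range(n_tosses)]  # 1 = heads, 0 = tails
    hits = total = 0
    for i in range(streak, n_tosses):
        if all(tosses[i - streak:i]):   # the previous `streak` tosses were all heads
            total += 1
            hits += tosses[i]           # did the next toss also come up heads?
    return hits / total

print(prob_heads_after_streak())  # close to 0.5: no tendency to 'revert' towards tails
```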
Intuition can be gained from the fact that such long-run averages are completely determined by the tail of the sequence, not by its first few terms. Concretely, write $a_n = 1$ if the $n$-th toss is heads and $a_n = 0$ otherwise, so that $\frac{1}{n}(a_1 + \dots + a_n)$ is the fraction of heads among the first $n$ tosses. Then for any fixed $k \in \Bbb N$ and any finite number of terms $b_1, \dots, b_k$ (even if $b_i \neq a_j$ for every $i,j$),
$$ \lim_{n\to\infty} \frac{1}{n}(a_1+\dots+a_n) = \lim_{n\to\infty} \frac{1}{n}(b_1+\dots+b_k+a_{k+1} + \dots + a_n). $$
Indeed, the two averages differ by at most $\frac{1}{n}\sum_{i=1}^{k}|b_i - a_i|$, which tends to $0$ as $n \to \infty$. Hence there is no need for the $a_j$s to 'correct the bad behaviour of the $b_i$s'; the $b_i$s simply do not matter in the long run.
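A minimal numerical sketch of this point (the sequence length and the number of replaced initial terms are arbitrary choices for illustration): overwrite the first $k$ outcomes of a simulated fair-coin sequence with an unbroken run of heads and compare the two running averages. Both come out close to $1/2$, with no compensating excess of tails required anywhere in the rest of the sequence.

```python
import random

def running_average_comparison(n=1_000_000, k=100, seed=1):
    """Compare the average of a fair-coin sequence (1 = heads, 0 = tails)
    with the average of the same sequence whose first k terms are all heads."""
    rng = random.Random(seed)
    a = [rng.randint(0, 1) for _ in range(n)]   # the a_j's
    b = [1] * k + a[k:]                         # b_i = 1 for the first k terms, then the same a_j's
    return sum(a) / n, sum(b) / n

print(running_average_comparison())  # both near 0.5; they differ by at most k/n = 0.0001
```

The design point mirrors the inequality above: the two averages can differ by at most $k/n$, so for large $n$ the initial block of heads is simply diluted away rather than 'paid back' by extra tails.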