I am self-studying Kolmogorov complexity, and for its purposes (Sipser exercise 6.26) I am trying to prove that for a constant $c$, as $n$ approaches infinity, the ratio of the number of bitstrings of length $n$ that contain $(1)^{c}$ (a run of $c$ consecutive 1s) to the total number of bitstrings of length $n$ approaches 1. (Runs longer than $c$ are acceptable as well, and the string may contain 1s outside the run.) It's hard to do without counting good strings more than once.
https://math.stackexchange.com/q/1530139 seems to answer the question for specific $n$ and $c$, but the result doesn't seem easy to evaluate as $n$ tends to infinity and for general $c$.
Here is a probabilistic way. Fix $c \geq 1$ and consider $n$ large enough. Choose a bitstring $\mathbf{x}$ of length $n$ uniformly at random; equivalently, each bit $x_i$ is independently $0$ or $1$ with probability $1/2$. The aim is to show that $$ \mathbb{P}(\mathbf{x} \not\ni (1)^c) = o(1) \quad(*) $$ (where the notation on the left-hand side means the random bitstring does *not* contain a run of $c$ consecutive $1$'s anywhere). If we show this, then $\mathbb{P}(\mathbf{x} \ni (1)^c) \to 1$ as $n \to \infty$, or equivalently, the ratio of bitstrings with the desired property to all bitstrings tends to one.
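To get a feel for the claim before proving it, here is a small Monte Carlo sketch (the function names `contains_run` and `estimate_prob` are my own, not from the exercise) that samples uniform bitstrings and estimates $\mathbb{P}(\mathbf{x} \ni (1)^c)$; the estimate should creep toward 1 as $n$ grows with $c$ fixed:

```python
import random

def contains_run(bits, c):
    """Return True iff the bit list contains c consecutive 1's."""
    run = 0
    for b in bits:
        run = run + 1 if b == 1 else 0
        if run >= c:
            return True
    return False

def estimate_prob(n, c, trials=10_000, seed=0):
    """Monte Carlo estimate of P(x contains 1^c) for uniform x of length n."""
    rng = random.Random(seed)
    hits = sum(
        contains_run([rng.randint(0, 1) for _ in range(n)], c)
        for _ in range(trials)
    )
    return hits / trials
```

For example, with $c = 3$, `estimate_prob(10, 3)` is noticeably below 1, while `estimate_prob(200, 3)` is essentially 1, matching the claimed limit.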
To show $(*)$: Chop the $n$ bits into $n/c$ consecutive blocks of $c$ bits each (we will assume $n/c \in \mathbb{Z}$ for simplicity; the general case requires only an easy modification). If the event on the left-hand side happens, then each of these $n/c$ blocks must differ from a block of $c$ consecutive $1$'s. For each block this happens with probability $1 - 2^{-c}$, so by independence, $$ \mathbb{P}(\mathbf{x} \not\ni (1)^c) \leq (1 - 2^{-c})^{n/c}. $$ Since $c$ is fixed, this upper bound tends to $0$ as $n \to \infty$.
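One can also check the bound numerically. The sketch below (my own code, not part of the proof) counts the avoiding strings exactly with a dynamic program whose state is the length of the trailing run of 1's, then compares the exact avoidance probability with the bound $(1 - 2^{-c})^{n/c}$:

```python
def count_avoiding(n, c):
    """Exact count of length-n bitstrings with no run of c consecutive 1's,
    via DP on the length of the trailing run of 1's (states 0..c-1)."""
    state = [1] + [0] * (c - 1)      # empty string: trailing run of length 0
    for _ in range(n):
        new = [0] * c
        new[0] = sum(state)          # append a 0: the run resets to 0
        for r in range(c - 1):
            new[r + 1] = state[r]    # append a 1: run grows but stays < c
        state = new
    return sum(state)

def check_bound(n, c):
    """Exact P(no run of c ones) versus the block bound (1 - 2^{-c})^{n/c}."""
    exact = count_avoiding(n, c) / 2 ** n
    bound = (1 - 2 ** -c) ** (n / c)
    return exact, bound
```

For instance, `count_avoiding(3, 3)` is $7$ (all $8$ strings except $111$), and for $n = 6$, $c = 3$ the exact probability $44/64 = 0.6875$ sits below the bound $(7/8)^2 = 0.765625$, as the blocking argument predicts.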