Consider flipping infinitely many coins and recording the results in a string $x_0 x_1 x_2 \dots$, and let $s_t$ denote the substring of length $n$ starting at position $t$.
Let $\tau_n = \min\{t > 0 : s_t = s_0\}$ be the random variable giving the first time at which the initial substring $s_0$ recurs. What is $\Bbb{E}(\tau_n)$?
It is easy to show $\Bbb{E}(\tau_1) = 2$: when $n = 1$, we are simply computing the expected least $t > 0$ such that $x_t = x_0$, and we can take advantage of the independence of the coin flips.
The probability that $x_t = x_0$ and $x_i \ne x_0$ for all $0 < i < t$ is $\left(\frac12\right) \cdot \left( \frac12 \right)^{t-1}$, so $$\Bbb{E}(\tau_1) = \sum_{t=1}^\infty t \cdot \Pr(\tau_1 = t) = \sum_{t=1}^\infty t \cdot \left(\frac12\right) \cdot \left( \frac12 \right)^{t-1} = 2.$$
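(For what it's worth, the partial sums of this geometric-type series converge to $2$ quickly, which is easy to check numerically:)

```python
# Numeric check: partial sums of sum_{t>=1} t * (1/2)^t approach E(tau_1) = 2.
partial = sum(t * 0.5**t for t in range(1, 200))
print(partial)  # ≈ 2.0
```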
Experimental data suggests $\Bbb{E}(\tau_n) = 2^n$. I considered an inductive argument: waiting for a repeat of $n+1$ flips necessarily requires a repeat of $n$ flips, followed by an additional wait until the final $(n+1)$-th character matches - a wait time equal in distribution(?) to $\tau_1$ - in which case $\Bbb{E}(\tau_{n+1}) = \Bbb{E}(\tau_n) \cdot \Bbb{E}(\tau_1)$?
I don't know how to make this rigorous though.
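(A Monte Carlo simulation along these lines reproduces the $2^n$ pattern; this is just a rough sketch, and `tau` is a name I made up for the helper:)

```python
import random

def tau(n, rng):
    # Flip coins until the initial length-n substring s_0 reappears at some t > 0.
    bits = [rng.randint(0, 1) for _ in range(n)]
    s0 = tuple(bits)
    t = 0
    while True:
        t += 1
        bits.append(rng.randint(0, 1))
        if tuple(bits[t:t + n]) == s0:
            return t

rng = random.Random(0)  # fixed seed so the run is reproducible
estimates = {n: sum(tau(n, rng) for _ in range(20000)) / 20000 for n in (1, 2, 3)}
print(estimates)        # each value should be close to 2**n
```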
Let me note up front that this does not give a complete solution.
I think a direct calculation might be easier than an induction. After trying a couple of things, I ended up finding it easiest to use a trick sometimes called the bathtub formula, or sometimes the Darth Vader formula (ugh haha...), which states that \[X \geq_{\text{a.s.}} 0 \implies \mathbb{E}(X) = \int_0^\infty P(X>x) \, dx.\]
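(To see the discrete version of this formula in action, here is a quick check on a Geometric($p$) variable supported on $1, 2, \dots$, for which $P(X>t)=(1-p)^t$ and $\mathbb{E}(X)=1/p$; just an illustrative sketch:)

```python
# Tail-sum (bathtub) check: E[X] = sum_{t>=0} P(X > t) for X ~ Geometric(p)
# supported on 1, 2, ..., where P(X > t) = (1 - p)^t and E[X] = 1/p.
p = 0.3
mean_direct = sum(t * p * (1 - p)**(t - 1) for t in range(1, 2000))
mean_tail = sum((1 - p)**t for t in range(0, 2000))
print(mean_direct, mean_tail)  # both ≈ 1/p ≈ 3.333...
```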
Using this formula, \begin{align*} \mathbb{E}(\tau_n) &= \sum_{t=0}^\infty P(\tau_n > t)\\ &= 1 + \sum_{t=1}^\infty P(\tau_n > t), \end{align*} where the 1 in the last line appears since surely $\tau_n>0$.
So, let's find \begin{align*} \qquad\qquad\qquad % for centering, pls ignore. P(\tau_n > t) &=\prod_{k=1}^t P(s_k \neq s_0) \qquad\qquad\qquad (*)\\ &=\prod_{k=1}^t \left(1 - P(s_k=s_0)\right)\\ &=\prod_{k=1}^t \left(1 - 2^{-n}\right)\\ &=(1-2^{-n})^t. \end{align*}
Plugging back in, $\mathbb{E}(\tau_n)=1+\sum_{t=1}^\infty (1 - 2^{-n})^t=1+\frac{1-2^{-n}}{2^{-n}}=1+(2^{n}-1)=2^n,$ which agrees with your experiments.
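(As a quick numeric sanity check of that geometric series, a small sketch:)

```python
# Check: 1 + sum_{t>=1} (1 - 2^-n)^t = 2^n, via a truncated geometric series.
for n in (1, 2, 3, 4):
    q = 1 - 2.0**-n
    val = 1 + sum(q**t for t in range(1, 5000))
    print(n, val)  # ≈ 2**n
```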
As mentioned in a comment, the line $(*)$ needs justification. Clearly $\{\tau_n > t\}=\bigcap_{k=1}^t \{ s_k \neq s_0 \}$, and we claim that these events are independent (which is surprising). Indeed, consider the joint distribution of just two of them. This can be broken down into a product of exclusive-or events $x_i\neq x_j$, which are independent: \begin{align*} P(s_1\neq s_0,s_2\neq s_0) &= P((x_1\neq x_0\cup x_2\neq x_1 \cup\dots\cup x_n\neq x_{n-1})\cap(x_2\neq x_0\cup x_3\neq x_1\cup\dots\cup x_{n+1}\neq x_{n-1})). \end{align*} I claim that for any $i\neq j$ and any $p,q$, the binary random variables $y_{i,p}=x_i\neq x_p$ and $y_{j,q}=x_j\neq x_q$ are independent. This is clearly true if $p\neq q$. If $p=q$ this follows because the xor operation $\neq$ preserves the independence of $x_i$ and $x_j$: if $x_p=1$ it just flips the bits, and if $x_p=0$ it does nothing. By exchanging the roles of $i,j$ and $p,q$ we see that $y_{i,p}$ and $y_{j,q}$ are also independent for any $i,j$ as long as $p\neq q$.
Well, this is as far as I've gotten with this method. To finish it off, we would need joint independence of the $y_{i,p}$ random variables, since $\{ s_t \neq s_0 \}$ happens exactly when $\sum_{i=0}^{n-1} y_{t+i,\,i}>0$, and these sums would be independent under joint independence of the $y_{i,p}$. Judging from comments on this answer, this is probably not going to work.
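(Indeed, a quick brute-force enumeration I tried for $n=2$ suggests the trouble appears already with two events: it gives $P(s_1=s_0,\,s_2=s_0)=1/8$ while $P(s_1=s_0)\,P(s_2=s_0)=1/16$, so the events in $(*)$ are not independent in general.)

```python
from itertools import product

# Exhaustive check for n = 2: are the events {s_1 = s_0} and {s_2 = s_0}
# independent?  Enumerate all bit strings x_0 x_1 x_2 x_3.
n, t = 2, 2
hits = {"e1": 0, "e2": 0, "both": 0}
total = 0
for bits in product((0, 1), repeat=n + t):
    s0 = bits[0:n]
    e1 = bits[1:1 + n] == s0   # event {s_1 = s_0}
    e2 = bits[2:2 + n] == s0   # event {s_2 = s_0}
    total += 1
    hits["e1"] += e1
    hits["e2"] += e2
    hits["both"] += e1 and e2
p1, p2, p12 = hits["e1"] / total, hits["e2"] / total, hits["both"] / total
print(p1, p2, p12, p1 * p2)  # 0.25 0.25 0.125 0.0625 -> not independent
```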