Many Markov chains exhibit a sharp cutoff in their convergence to stationarity: the total variation distance to the stationary distribution stays close to $1$ for a long time and then drops rapidly to $0$ around a certain time.
Every definition I have seen of the cutoff phenomenon involves a sequence of Markov chains. For example, from *Markov Chains and Mixing Times* by Levin, Peres, and Wilmer:
> Suppose that, for a sequence of Markov chains indexed by $n = 1, 2, \dots$, the mixing time of the $n$-th chain is denoted by $t_{\mathrm{mix}}^{(n)}(\epsilon)$. The sequence of chains has a $\textbf{cutoff}$ if, for all $\epsilon > 0$,
> $$\lim_{n\to\infty} \frac{t_{\mathrm{mix}}^{(n)}(\epsilon)}{t_{\mathrm{mix}}^{(n)}(1 - \epsilon)} = 1.$$
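To get a concrete feel for this ratio, I tried the definition numerically on the lazy random walk on the hypercube $\{0,1\}^n$, which I understand is a standard example of a family with cutoff (at time $\tfrac{1}{2} n \log n$). The sketch below is my own rough code, not from the book: it uses the standard reduction of the hypercube walk to its Hamming-weight (Ehrenfest) chain so that large $n$ is feasible, and the helper names (`weight_chain_P`, `t_mix`) are just mine.

```python
import numpy as np
from math import comb

def weight_chain_P(n):
    # Hamming-weight projection of the lazy walk on {0,1}^n (an
    # Ehrenfest-type birth-death chain on {0, ..., n}): hold with
    # probability 1/2; otherwise flip a uniformly chosen coordinate,
    # which raises the weight w.p. (n-k)/n and lowers it w.p. k/n.
    P = np.zeros((n + 1, n + 1))
    for k in range(n + 1):
        P[k, k] = 0.5
        if k < n:
            P[k, k + 1] = (n - k) / (2 * n)
        if k > 0:
            P[k, k - 1] = k / (2 * n)
    return P

def t_mix(n, eps, t_max=100_000):
    # Smallest t with TV distance <= eps, started from the all-zeros
    # vertex.  The hypercube is vertex-transitive, so this start is
    # worst case; and since the walk's law is uniform on each weight
    # level, its TV distance to uniform equals the TV distance of the
    # weight chain to Binomial(n, 1/2).
    P = weight_chain_P(n)
    pi = np.array([comb(n, k) for k in range(n + 1)], dtype=float) / 2 ** n
    mu = np.zeros(n + 1)
    mu[0] = 1.0                      # weight 0 = the all-zeros vertex
    for t in range(t_max + 1):
        if 0.5 * np.abs(mu - pi).sum() <= eps:
            return t
        mu = mu @ P
    raise RuntimeError("increase t_max")

eps = 0.05
for n in (10, 100, 1000):
    ratio = t_mix(n, eps) / t_mix(n, 1 - eps)
    print(f"n = {n:5d}:  t_mix({eps}) / t_mix({1 - eps}) = {ratio:.3f}")
```

The printed ratios do drift toward $1$, though slowly, which I gather is consistent with a cutoff window of order $n$ around a cutoff time of order $n \log n$ (so the ratio behaves like $1 + O(1/\log n)$).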
Even with this numerical picture, I am struggling to understand the definition conceptually. Can someone provide some insight into where it comes from? The cutoff phenomenon is supposed to describe what happens over the iterations of a single, fixed chain, so what is the sequence of chains meant to represent? In the mixing-time results elsewhere in that book, $n$ refers to the number of steps of a given chain, but here each $n$ indexes an entirely separate chain.