In a hypothetical world, consider a die with $k$ faces labelled $1$ to $k$, each face equally likely to come up, i.e. with probability $1/k$. We use $X_t$ to denote the number that comes up on the $t$-th throw. Given $m\in\Bbb N^+$, define the number of throws until a run of $m$ identical numbers: $$N:=\inf\{t\ge m: X_{t-m+1},\cdots,X_t \text{ are identical}\}$$ What is $\Bbb E(N)$?
My thought: let $A_n$ be the event that we have achieved a run of $n$ identical numbers, and for $n<m$ let $\Bbb E(N\mid A_n)$ denote the expected number of additional throws needed to achieve an $m$-run. Then we have:
\begin{align} \Bbb E(N\mid A_m)&=0 \tag{1}\\ \Bbb E(N\mid A_n)&=1+\frac1k\Bbb E(N\mid A_{n+1})+\frac{k-1}{k}\Bbb E(N\mid A_1),\quad 1\le n \le m-1 \tag{2}\\ \Bbb E(N)&=\Bbb E(N\mid A_1)+1 \tag{3} \end{align} Then we only have to solve (1) and (2) for $\Bbb E(N\mid A_1)$, since (3) is trivial. So it's $m$ equations in $m$ unknowns, and we just have to solve this linear system.
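For what it's worth, the system can be solved mechanically without writing out a full matrix: each $\Bbb E(N\mid A_n)$ is an affine function of the unknown $x=\Bbb E(N\mid A_1)$, so back-substituting from $\Bbb E(N\mid A_m)=0$ leaves a single linear equation in $x$. A minimal Python sketch (the function name is illustrative):

```python
def expected_throws(k, m):
    # Write E(N | A_n) = a + b*x, where x = E(N | A_1) is unknown.
    # Start from E(N | A_m) = 0 and apply the recurrence
    #   E(N | A_n) = 1 + (1/k) E(N | A_{n+1}) + ((k-1)/k) x
    # for n = m-1 down to 1.
    a, b = 0.0, 0.0
    for _ in range(m - 1):
        a, b = 1 + a / k, b / k + (k - 1) / k
    x = a / (1 - b)      # solve E(N | A_1) = a + b*x = x
    return x + 1         # E(N) = E(N | A_1) + 1
```

For example, `expected_throws(6, 2)` gives $7$, matching $(6^2-1)/(6-1)$.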
This method is perfectly okay, but it is slow by hand even for relatively small $k,m$. Does there exist any other significantly slicker approach that can lead to a solution easily obtainable by hand, say in an interview?
There is a way that is a little simpler (I think) than the direct approach. It requires two main steps.
First we have to show that the expected value $\mathbb E(N)$ is finite. One way is to partition the throws into consecutive blocks of $m$ and consider the stricter stopping rule that stops only when some block consists of $m$ identical rolls. Each block is all-identical with probability $k^{-(m-1)},$ independently, so the number of blocks needed has a geometric distribution and the stricter stopping time has finite expectation $m\,k^{m-1}.$ Our random variable accepts $m$ consecutive identical rolls anywhere, not just aligned to a block boundary, so $N$ is no greater than that stopping time and $\mathbb E(N) \le m\,k^{m-1} < \infty.$
Now consider the first run of identical rolls. The run has a length of at least one (the very first roll always matches itself!) and has no upper limit on its length, but we are only interested in lengths up to $m$ rolls. (We do not need to consider more than $m$ rolls, because at $m$ rolls we have an $m$-run.)
In the cases where the first run has length $t$ where $t < m$, the conditional expectation $\mathbb E(N \mid \text{first run has length $t$})$ is $t + \mathbb E(N).$ The probability of each such case is $(k-1)k^{-t}.$ The only other case to consider is the case where $t \geq m,$ in which case the conditional expectation is $m.$ The probability of that case is $k^{-(m-1)}.$ So by the law of total expectation,
\begin{align} \mathbb E(N) &= \left(\sum_{t = 1}^{m-1} (t + \mathbb E(N)) \frac{k-1}{k^t}\right) + m\frac{1}{k^{m-1}} \\ &= (k-1) \left(\sum_{t = 1}^{m-1} \frac{t}{k^t}\right) + (k-1) \mathbb E(N) \left(\sum_{t = 1}^{m-1} \frac{1}{k^t}\right) + \frac{m}{k^{m-1}}. \end{align}
Now we have some familiar series to work with, although the fact that they are finite series forces us to do some extra work. Recalling that $\sum_{t = 1}^\infty (1/k^t) = 1/(k-1)$ and $\sum_{t = 1}^\infty (t/k^t) = k/(k-1)^2,$
\begin{align} \sum_{t = 1}^{m-1} \frac{t}{k^t} &= \left(\sum_{t = 1}^\infty \frac{t}{k^t}\right) - \left(\sum_{t = m}^\infty \frac{t}{k^t}\right) \\ &= \left(\sum_{t = 1}^\infty \frac{t}{k^t}\right) - \left(\frac{m-1}{k^{m-1}} \left(\sum_{t = 1}^\infty \frac{1}{k^t}\right) + \frac{1}{k^{m-1}} \left(\sum_{t = 1}^\infty \frac{t}{k^t}\right)\right)\\ &= \frac{k}{(k-1)^2} - \left(\frac{m-1}{k^{m-1}(k-1)} + \frac{k}{k^{m-1}(k-1)^2} \right) \\ &= \frac{k^m - (m - 1)(k - 1) - k}{k^{m-1}(k-1)^2} \\ &= \frac{k^m - 1 - m(k - 1)}{k^{m-1}(k-1)^2}. \end{align}
And there is also the better-known result
\begin{align} \sum_{t = 1}^{m-1} \frac{1}{k^t} &= \left(\sum_{t = 1}^\infty \frac{1}{k^t}\right) - \left(\sum_{t = m}^\infty \frac{1}{k^t}\right) \\ &= \left(1 - \frac{1}{k^{m-1}}\right)\left(\sum_{t = 1}^\infty \frac{1}{k^t}\right)\\ &= \frac{k^{m-1} - 1}{k^{m-1}(k-1)}. \end{align}
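Both finite-series identities are easy to sanity-check with exact rational arithmetic; a quick Python check using the standard library's `fractions` (the function name is illustrative):

```python
from fractions import Fraction

def check_partial_sums(k, m):
    # sum_{t=1}^{m-1} t/k^t against its closed form
    s1 = sum(Fraction(t, k**t) for t in range(1, m))
    ok1 = s1 == Fraction(k**m - 1 - m * (k - 1), k**(m - 1) * (k - 1)**2)
    # sum_{t=1}^{m-1} 1/k^t against its closed form
    s2 = sum(Fraction(1, k**t) for t in range(1, m))
    ok2 = s2 == Fraction(k**(m - 1) - 1, k**(m - 1) * (k - 1))
    return ok1 and ok2
```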
Putting this together,
\begin{align} \mathbb E(N) &= \frac{k^m - 1 - m(k - 1)}{k^{m-1}(k-1)} + \frac{k^{m-1} - 1}{k^{m-1}} \mathbb E(N) + \frac{m}{k^{m-1}}. \end{align}
Multiply through by $k^{m-1}$ and simplify:
\begin{align} k^{m-1} \mathbb E(N) &= \frac{k^m - 1 - m(k - 1)}{k-1} + (k^{m-1} - 1) \mathbb E(N) + m\\ &= \left(\frac{k^m - 1}{k-1} - m\right) + (k^{m-1} - 1) \mathbb E(N) + m. \end{align}
Canceling some terms and collecting $\mathbb E(N)$ on the left,
\begin{align} \mathbb E(N) &= \frac{k^m - 1}{k-1}. \end{align}
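As a sanity check, this closed form does satisfy the total-expectation identity we started from. Verifying with exact rational arithmetic in Python (the function name is illustrative):

```python
from fractions import Fraction

def satisfies_renewal(k, m):
    # Check E(N) = sum_{t=1}^{m-1} (t + E(N)) (k-1)/k^t  +  m / k^(m-1)
    # with E(N) = (k^m - 1)/(k - 1).
    E = Fraction(k**m - 1, k - 1)
    rhs = sum((t + E) * Fraction(k - 1, k**t) for t in range(1, m))
    rhs += Fraction(m, k**(m - 1))
    return rhs == E
```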
And as yet another way of looking at it: when you throw the die for the first time, or whenever you throw a number different from the previous one, you start a new run, and the probability that this run will be an $m$-run (or longer) is $k^{-(m-1)}.$ The number of runs prior to the successful run therefore has a (shifted) geometric distribution with support $\{0, 1, 2, \ldots\}$ and expected value $k^{m-1}-1.$
Within each of the runs prior to the successful run, the distribution of the length of the run is a kind of truncated geometric distribution; that is, if $X$ is the length of the run then $$ \mathbb P(X=t) = \frac1{\sum_{u = 1}^{m-1} \frac{k-1}{k^u}} \left( \frac{k-1}{k^t} \right), \quad 1 \le t \le m-1. $$ But $$ \sum_{u = 1}^{m-1} \frac{k-1}{k^u} = (k - 1)\frac{k^{m-1} - 1}{k^{m-1}(k-1)} = \frac{k^{m-1} - 1}{k^{m-1}}, $$ so $$ \mathbb P(X=t) = \frac{k^{m-1}}{k^{m-1} - 1} \left( \frac{k-1}{k^t} \right). $$
The expected length of the run (conditioned on it being a run of length less than $m$) is \begin{align} \sum_{t = 1}^{m-1} t\mathbb P(X=t) &= \frac{k^{m-1}(k-1)}{k^{m-1} - 1} \sum_{t = 1}^{m-1} \frac{t}{k^t} \\ &= \frac{k^{m-1}(k-1)}{k^{m-1} - 1} \left( \frac{k^m - 1 - m(k - 1)}{k^{m-1}(k-1)^2}\right) \\ &= \frac{k^m - 1 - m(k - 1)}{(k^{m-1} - 1)(k-1)}. \end{align}
But we expect $k^{m-1}-1$ such runs, so the total number of expected rolls is $$ (k^{m-1}-1)\sum_{t = 1}^{m-1} t\mathbb P(X=t) = \frac{k^m - 1 - m(k - 1)}{k-1} = \frac{k^m - 1 }{k-1} - m. $$
Add $m$ for the $m$ consecutive identical rolls at the end, and the total is $$ \frac{k^m - 1 }{k-1} .$$
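The formula is also easy to confirm empirically; a quick Monte Carlo sketch in Python (trial count, seed, and function name are arbitrary choices):

```python
import random

def average_throws(k, m, trials=100_000, seed=1):
    # Simulate throwing a fair k-sided die until m identical throws in a row,
    # and return the average number of throws over many trials.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        run_length, previous, throws = 0, None, 0
        while run_length < m:
            face = rng.randrange(k)
            throws += 1
            run_length = run_length + 1 if face == previous else 1
            previous = face
        total += throws
    return total / trials
```

For $k=6$, $m=2$ the average hovers around $(6^2-1)/5 = 7$, as expected.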