Pólya urn model: At time $0$ an urn contains $b$ $\tt{B}$lue balls and $r$ $\tt{R}$ed balls. At time $1$, a ball is drawn uniformly at random from the urn and returned together with one additional ball of the drawn color. This procedure is then repeated at times $2,3,4,\ldots$, each time increasing the total number of balls in the urn by one while randomly changing its composition, as follows:
$$(x,\ y) \longrightarrow\begin{cases} (x+1,\ y), & \text{with probability $x\over x+y$} \\[2ex] (x,\ y+1), & \text{with probability $y\over x+y$} \end{cases}$$
where $x$ (resp. $y$) denotes the number of blue (resp. red) balls.
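For concreteness, the transition rule above can be sketched in a few lines of Python (a minimal simulation; the function name `polya_draws` is my own):

```python
import random

def polya_draws(b, r, n, rng=random):
    """Simulate n draws from a Polya urn starting with b blue, r red balls.

    After each draw the ball is returned together with one extra ball of
    the same color.  Returns the list of drawn colors ('B' or 'R').
    """
    x, y = b, r                          # current blue / red counts
    draws = []
    for _ in range(n):
        if rng.random() < x / (x + y):   # blue drawn w.p. x/(x+y)
            draws.append('B')
            x += 1
        else:                            # red drawn w.p. y/(x+y)
            draws.append('R')
            y += 1
    return draws
```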
Now let $T_w$ denote the time of the first occurrence of word $w\in\{\tt{B,R}\}^*$ in the sequence of drawn colors. (E.g., we have $T_\epsilon=0$ always, and in the sequence $\tt{RBRBB}\ldots$ we have $T_{\tt{B}}=2,\ T_{\tt{BB}}=5$.)
Question: Is there a known closed form for $E(T_w)$ in any case other than $T_{\tt \epsilon}$, $T_{\tt B}$, or $T_{\tt R}$? If so, what is it, and how to derive it?
Note 1: $E(T_{\tt B})$ (and similarly $E(T_{\tt R})$) can be found as follows. (The resulting form is simple -- is there a simpler way to derive it?) Let $s=b+r$. Using (rising) Pochhammer symbols $$(a)_n = \begin{cases} 1 &\text{ if }n = 0 \\ a(a+1)(a+2)\cdots (a+n-1) &\text{ if } n > 0 \end{cases}$$ we can write the expectation as a hypergeometric series and apply well-known identities:
$$\begin{align}E(T_{\tt B})&={b\over s}1+{r\over s}{b\over s+1}2+{r\over s}{r+1\over s+1}{b\over s+2}3+{r\over s}{r+1\over s+1}{r+2\over s+2}{b\over s+3}4+\cdots\\[2ex] &={b\over s}\sum_{k=0}^\infty{(r)_k\over (s+1)_k}(k+1)\\[2ex] &={b\over s}\sum_{k=0}^\infty{(r)_k (2)_k\over (s+1)_k}\,{1^k\over k!}\\[2ex] &={b\over s}\ {_2F_1}(r,2;s+1;1)\\[2ex] &={b\over s}\ {\Gamma(s+1)\,\Gamma(b-1)\over \Gamma(b+1)\,\Gamma(s-1)}\\[2ex] &={b\over s}\ {s\,(s-1)\over b\,(b-1)}\\[2ex] &={s-1\over b-1} \end{align}$$ where we've used the following identities: (1) $(2)_k=(k+1)!\ $, (2) Gauss's theorem: $${_2F_1}(a,b;c;1)= {\Gamma(c)\,\Gamma(c-a-b)\over \Gamma(c-a)\,\Gamma(c-b)}\quad\text{if $c>a+b$},$$ and (3) $\Gamma(a+1)=a\Gamma(a)=a(a-1)\Gamma(a-1)$. (Note that $E(T_{\tt B})<\infty$ only for $b>1$.)
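As a sanity check on the closed form $E(T_{\tt B})=(s-1)/(b-1)$, here is a small Monte Carlo estimate (a rough sketch; with $b=3,\ r=2$ the exact value is $2$):

```python
import random

def first_time_B(b, r, rng):
    """Time of the first blue draw from a Polya urn with b blue, r red."""
    x, y, t = b, r, 0
    while True:
        t += 1
        if rng.random() < x / (x + y):   # blue drawn: stop
            return t
        y += 1                           # red drawn: add another red

rng = random.Random(42)
b, r = 3, 2
s = b + r
n_trials = 200_000
est = sum(first_time_B(b, r, rng) for _ in range(n_trials)) / n_trials
exact = (s - 1) / (b - 1)   # closed form derived above; equals 2 here
```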
Note 2: $(E(T_{\tt BB})=?)\ \ $ Applying the same method as above and noting that there are $\binom{i}{j-2}$ sequences of length $i+j$ that contain $i$ ${\tt R}$s and $j$ ${\tt B}$s with ${\tt BB}$ occurring only at the end, we have the following:
$$\begin{align}E(T_{\tt BB})&=\sum_{j=2}^\infty\sum_{i=j-2}^\infty{(r)_i(b)_j\over (s)_{i+j}}\binom{i}{j-2}(i+j)\\[2ex] &=\sum_{j=2}^\infty(b)_j\sum_{i=j-2}^\infty{(r)_i\over (s)_{i+j}}\binom{i}{j-2}(i+j)\\[2ex] &=\sum_{j=2}^\infty\left\{(b)_j\sum_{i=j-2}^\infty{(r)_i\over (s)_{i+j}}\binom{i}{j-2}i +(b)_j j\sum_{i=j-2}^\infty{(r)_i\over (s)_{i+j}}\binom{i}{j-2}\right\}\\[2ex] \end{align}$$
I've tried without success to reduce this to generalized hypergeometric series $\ {_pF_q}\ $ with some special argument value as before, but the summations seem to be intractable. Does anyone know how this might be done? (Or know of a simpler method?)
EDIT: By inspecting pseudorandom simulations for a variety of $(b,r)$ values, it appears that when $r\ge 0$ and $b\ge3$, we have $$E(T_{\tt BB})=\ \left(1+{r\over b-1}\right)\left(2+{r\over b-2}\right).$$ Unable to show that the above series reduces to this, I've asked for proof in a separate question.
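The kind of simulation I used to arrive at this conjecture can be sketched as follows (names are mine; with $b=6,\ r=2$ the conjectured value is $(1+\tfrac25)(2+\tfrac24)=3.5$):

```python
import random

def first_time_BB(b, r, rng):
    """First time two consecutive blue draws occur in the Polya urn."""
    x, y, t, run = b, r, 0, 0
    while True:
        t += 1
        if rng.random() < x / (x + y):   # blue drawn
            x += 1
            run += 1
            if run == 2:
                return t
        else:                            # red drawn: reset the blue run
            y += 1
            run = 0

rng = random.Random(1)
b, r = 6, 2
n_trials = 200_000
est = sum(first_time_BB(b, r, rng) for _ in range(n_trials)) / n_trials
conj = (1 + r / (b - 1)) * (2 + r / (b - 2))   # conjectured closed form
```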
I cross-posted this question to MathOverflow, where it was answered in the affirmative. Here I summarize that answer and apply it to the general case of an arbitrary finite pattern-string $w$, also working out a couple of examples for illustration. The key point of that answer is the following well-known fact: the sequence of colors drawn from the urn is distributed as a sequence of conditionally i.i.d. Bernoulli trials whose success probability $\tilde{p}$ is itself drawn, once, from a Beta$(b,r)$ distribution. Proofs of this fact can be found online, e.g. in these articles: Pólya's Urn Process, The Beta-Bernoulli Process.
Therefore, $$E(T_{w})=E\,E(T_{w}\mid\tilde{p})=\int_0^1 E(T_{w}\mid \tilde{p}=p)\,f(p)\,dp\tag{1}$$ where $\tilde{p}$ is a random variable whose density function $f(p)$ is that of a Beta$(b,r)$ distribution, i.e., $$f(p) = {1\over B(b,r)}p^{b-1}(1-p)^{r-1}\quad (0<p<1),$$ and $T_w$ (given $\tilde{p}=p$) is the first-occurrence time of word $w\in\{\tt{B,R}\}^*$ in a sequence of Bernoulli trials $X_1,X_2,\ldots$ in which $P(X_i={\tt B})=p,\ P(X_i={\tt R})=1-p.$
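This mixture representation is easy to sanity-check on a small case: the probability that the first two draws are both blue is ${b\over s}\cdot{b+1\over s+1}$ from the urn dynamics, and $E(\tilde p^2)=B(b+2,r)/B(b,r)$ from the Beta-Bernoulli mixture. A short Python check (function name `beta_fn` is mine):

```python
from math import gamma

def beta_fn(x, y):
    """Beta function computed via the Gamma function."""
    return gamma(x) * gamma(y) / gamma(x + y)

b, r = 3, 2
s = b + r
# Urn dynamics: P(first draw B) * P(second draw B | first draw B)
direct = (b / s) * ((b + 1) / (s + 1))
# Beta-Bernoulli mixture: E[p^2] for p ~ Beta(b, r)
mixture = beta_fn(b + 2, r) / beta_fn(b, r)
```

Both evaluate to $2/5$ here, as they should.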
Mean first-occurrence time of an arbitrary word
It can be proved (e.g. in this online article) that for such a sequence of Bernoulli trials, given $\tilde{p}=p$, the mean first-occurrence time of an arbitrary word $w\in\{\tt B,R\}^*$ is $$E(T_{w}\mid\tilde{p}=p) = \sum_{s\in S_w} p^{-b_s}(1-p)^{-r_s}\tag{2}$$ where $S_w$ is the set of bifixes (i.e. nonempty suffixes that are also prefixes) of the word $w$, and $b_s$ (resp. $r_s$) is the number of $\tt B$s (resp. $\tt R$s) in $s$.
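The bifix set and the conditional expectation (2) are both mechanical to compute; a short sketch (function names are mine):

```python
def bifixes(w):
    """Nonempty suffixes of w that are also prefixes of w."""
    return [w[-k:] for k in range(1, len(w) + 1) if w[-k:] == w[:k]]

def mean_first_time_given_p(w, p):
    """E(T_w | p) for i.i.d. trials with P(B) = p, via the bifix sum (2)."""
    total = 0.0
    for s in bifixes(w):
        b_s, r_s = s.count('B'), s.count('R')
        total += p ** (-b_s) * (1 - p) ** (-r_s)
    return total
```

For instance, with $p=1/2$ this recovers the classical values $E(T_{\tt B}\mid p)=2$ and $E(T_{\tt BB}\mid p)=2+4=6$.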
Now inserting (2) into (1), we readily find $E(T_w)$:
$$\begin{align}E(T_{w})&=\int_0^1\left(\sum_{s\in S_w} p^{-b_s}(1-p)^{-r_s}\right)\left({1\over B(b,r)}\,p^{b-1}(1-p)^{r-1}\right)dp\tag{3}\\[2ex] &={1\over B(b,r)}\sum_{s\in S_w}\int_0^1 p^{b-b_s-1}(1-p)^{r-r_s-1}\,dp\tag{4}\\[2ex] &={1\over B(b,r)}\sum_{s\in S_w}B(b-b_s,r-r_s)\tag{5} \end{align}$$ which is the desired closed form for the general case. (Note that the expectation fails to exist for any word $w$ that has a bifix $s$ containing at least $b\ {\tt B}$s or at least $r\ {\tt R}$s -- i.e. if $b_s\ge b$ or $r_s\ge r$.)
NB: If $x$ and $y$ are positive integers, the Beta function is such that $$B(x,y)={(x-1)!\,(y-1)!\over (x+y-1)!}, $$ so Eq. (5) shows that when $E(T_{w})$ exists, it is just a finite sum of products of ratios of factorials. More explicitly, $$E(T_{w})=\sum_{s\in S_w}\left[\left(\prod_{k=1}^{b_s}{b-k+r-r_s\over b-k} \right) \left(\prod_{l=1}^{r_s}{b+r-l\over r-l}\right)\right]\tag{5$^\prime$},$$ obtained by rewriting ${B(b-b_s,r-r_s)}$ via repeated application of the following identities: $$\begin{align}B(x,y)&={x+y\over x}B(x+1,y)\quad (x>0)\\[2ex] B(x,y)&={x+y\over y}B(x,y+1)\quad (y>0). \end{align}$$
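Since only integer Beta values appear, Eq. (5) is trivial to evaluate exactly. A small Python sketch (function names are mine), returning `None` when the expectation fails to exist per the note above:

```python
from math import factorial

def beta_int(x, y):
    """Beta function at positive integer arguments, via factorials."""
    return factorial(x - 1) * factorial(y - 1) / factorial(x + y - 1)

def bifixes(w):
    """Nonempty suffixes of w that are also prefixes of w."""
    return [w[-k:] for k in range(1, len(w) + 1) if w[-k:] == w[:k]]

def expected_first_time(w, b, r):
    """E(T_w) for an urn started with b blue, r red balls, via Eq. (5).

    Returns None when some bifix s has b_s >= b or r_s >= r, in which
    case the expectation does not exist.
    """
    total = 0.0
    for s in bifixes(w):
        b_s, r_s = s.count('B'), s.count('R')
        if b_s >= b or r_s >= r:
            return None
        total += beta_int(b - b_s, r - r_s)
    return total / beta_int(b, r)
```

E.g. `expected_first_time('B', 3, 2)` gives $2$, matching $(s-1)/(b-1)$, and `expected_first_time('BB', 4, 2)` gives $5$, matching the conjectured product form.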
Examples
Example 1 ($w={\tt B}^n$, a block of $n$ consecutive $\tt B$s)
The word ${\tt B}^{n}\,(n\ge 1)$ has exactly $n$ bifixes, namely ${\tt B}, {\tt BB}, \ldots,{\tt B}^{n}$; hence (5) gives $$\begin{align}E(T_{{\tt B}^n})&={1\over B(b,r)}\sum_{s\in \{{\tt B},\,{\tt BB},\,\ldots,\,{\tt B}^n \}}B(b-b_s,r-r_s)\\[2ex] &={1\over B(b,r)}\sum_{i=1}^{n}B(b-i,r)\\[2ex] &=\sum_{i=1}^{n}\prod_{k=1}^{i}\frac{b-k+r}{b-k}. \end{align}$$
With $n=1$ this reproduces the fact (derived in my question) that $E(T_{{\tt B}})={b-1+r\over b-1}$, and with $n=2$ this confirms my conjecture that $$E(T_{\tt BB})\ =\ {b-1+r\over b-1}+{(b+r-1)(b+r-2)\over (b-1)(b-2)} \ =\ \left(1+{r\over b-1}\right)\left(2+{r\over b-2}\right).$$
Example 2 ($w={\tt BRB}$)
The word $\tt BRB$ has exactly two bifixes, namely $\tt B$ and $\tt BRB$; hence, (5) gives $$\begin{align}E(T_{\tt BRB})&={1\over B(b,r)}\sum_{s\in \{{\tt B},\,{\tt BRB}\}}B(b-b_s,r-r_s)\\[2ex] &={1\over B(b,r)}{\big(}B(b-1,r) + B(b-2,r-1){\big)}\\[2ex] &={b-1+r\over b-1}+{(b-1+r)(b-2+r)(b-3+r)\over(b-1)(b-2)(r-1)}. \end{align}$$
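This closed form can also be checked against a direct simulation of the urn (a rough sketch, names mine; with $b=8,\ r=5$ the formula gives ${12\over 7}+{12\cdot 11\cdot 10\over 7\cdot 6\cdot 4}={67\over 7}\approx 9.571$):

```python
import random

def urn_first_time(w, b, r, rng):
    """Simulate the Polya urn until word w first occurs in the draws."""
    x, y, t = b, r, 0
    recent = ''
    while True:
        t += 1
        if rng.random() < x / (x + y):
            c, x = 'B', x + 1
        else:
            c, y = 'R', y + 1
        recent = (recent + c)[-len(w):]   # sliding window of recent draws
        if recent == w:
            return t

rng = random.Random(7)
b, r = 8, 5
n_trials = 200_000
est = sum(urn_first_time('BRB', b, r, rng) for _ in range(n_trials)) / n_trials
exact = (b + r - 1) / (b - 1) \
    + (b + r - 1) * (b + r - 2) * (b + r - 3) / ((b - 1) * (b - 2) * (r - 1))
```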