Let $\{B_j(p) \mid j \geq 1\}$ be an i.i.d. sequence of Bernoulli$(p)$ random variables (a sequence of Bernoulli trials), where the win probability is
$$p = P\{B_j(p) = 1\} = E[B_j(p)].$$ Now define $X$ to be the negative binomial random variable counting the total number of wins among the Bernoulli$(p)$ trials before the $\ell$-th loss. Equivalently,
$$X = \widetilde{B}_1(p) \oplus \cdots \oplus \widetilde{B}_{\ell}(p),$$ where the $\widetilde{B}_j(p)$ are independent Geometric$(p)$ random variables (the length of the $j$-th winning streak) and $\oplus$ denotes a sum of independent terms. The goal is to show that whenever $\ell \geq 1$, we have
$$E\left[1 - B_1(p) \mid X\right] = \frac{\ell - 1}{\ell - 1 + X}, \quad \text{hence} \quad E\left[\frac{\ell - 1}{\ell - 1 + X}\right] = 1 - p. $$
Hint: Consider $E[B_j(p) \mid X = k]$ for all integers $j$ ranging from 1 to $k + \ell - 1$.
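As a sanity check on the claimed identity $E\left[\frac{\ell-1}{\ell-1+X}\right] = 1 - p$, here is a quick Monte Carlo simulation I wrote (just a sketch; the function names are my own):

```python
import random

def sample_X(p, ell, rng):
    """Count wins in Bernoulli(p) trials until the ell-th loss (negative binomial)."""
    wins = losses = 0
    while losses < ell:
        if rng.random() < p:
            wins += 1
        else:
            losses += 1
    return wins

def estimate_lhs(p, ell, n=200_000, seed=0):
    """Monte Carlo estimate of E[(ell-1)/(ell-1+X)]; should be close to 1 - p."""
    rng = random.Random(seed)
    return sum((ell - 1) / (ell - 1 + sample_X(p, ell, rng)) for _ in range(n)) / n

print(estimate_lhs(p=0.3, ell=4))  # close to 1 - p = 0.7
```

(The estimate does land near $1 - p$, so the identity itself seems fine; my trouble is with the conditional-expectation argument below.)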
What I have so far: For any random variable $Z$ and any discrete $Y$ that takes on the values $y_1,\ldots,y_n$, we define $E[Z \mid Y]$ to be the random variable $$ E[Z \mid Y] \equiv \sum_{i=1}^n E[Z \mid Y = y_i] \cdot \{Y = y_i\} = f(Y), $$ where $$ f(y_i) \equiv E[Z \mid Y = y_i]. $$
Given this definition, consider the conditional expectation of $B_j(p)$ given $X$, summing over all values $i \geq 0$ that $X$ can take: $$ E[B_j(p) \mid X] = \sum_{i \geq 0} E[B_j(p) \mid X = i] \cdot \{X = i\}. $$ This is $f(X)$, where $f(i) = E[B_j(p) \mid X = i]$.
Thus, the conditional expectation is itself a random variable. Now, we calculate the expectation of the conditional expectation: \begin{align*} E[E[B_j(p) \mid X]] &= E\left[\sum_{i \geq 0} E[B_j(p) \mid X = i] \cdot \{X = i\}\right] \\ &= \sum_{i \geq 0} E[B_j(p) \mid X = i] \cdot E[\{X = i\}] \\ &= \sum_{i \geq 0} E\left[\frac{B_j(p) \cdot \{X = i\}}{P\{X = i\}}\right] \cdot P\{X = i\} \\ &= E\left[B_j(p) \cdot \sum_{i \geq 0} \{X = i\}\right] \\ &= E[B_j(p)]. \end{align*}
But I expected to get $\frac{k}{k+\ell-1}$ out of this, and instead the computation gives $E[B_j(p)] = p$. What am I doing wrong?
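For what it's worth, I also estimated the two quantities numerically, conditioning on a fixed value $X = k$ versus averaging over everything (again my own sketch, simulating the trial sequence directly):

```python
import random

def one_run(p, ell, rng):
    """Simulate Bernoulli(p) trials until the ell-th loss; return (B_1, X)."""
    outcomes = []
    losses = 0
    while losses < ell:
        b = 1 if rng.random() < p else 0
        outcomes.append(b)
        losses += 1 - b
    return outcomes[0], sum(outcomes)

def estimate(p, ell, k, n=500_000, seed=1):
    """Estimate E[B_1 | X = k] (restricting to runs with X = k)
    and the unconditional E[B_1] (averaging over all runs)."""
    rng = random.Random(seed)
    hit_sum = hit_cnt = total_sum = 0
    for _ in range(n):
        b1, x = one_run(p, ell, rng)
        total_sum += b1
        if x == k:
            hit_sum += b1
            hit_cnt += 1
    return hit_sum / hit_cnt, total_sum / n

cond, uncond = estimate(p=0.5, ell=3, k=4)
print(cond)    # close to k/(k + ell - 1) = 4/6
print(uncond)  # close to p = 0.5
```

So numerically the conditional mean given $X = k$ really is $\frac{k}{k+\ell-1}$ while the overall mean is $p$, and somehow my derivation only ever produces the latter.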