Setting:
At time $0$, an urn contains $1$ black ball and $1$ white ball. At each time $n \ge 1$ a ball is chosen uniformly at random from the urn and is replaced together with a new ball of the same colour. Just after time $n$, there are therefore $n + 2$ balls in the urn, of which $B_n + 1$ are black, where $B_n$ is the number of black balls chosen by time $n$. We let $\mathcal F_n = \sigma(B_1,\dots,B_n)$ for $n \ge 1$ and $\mathcal F_0 = \{\emptyset,\Omega\}$.
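For concreteness, the urn dynamics can be simulated directly. This is a sketch in Python; the helper name `polya_urn` is mine, not from the problem statement:

```python
import random

def polya_urn(n_steps, seed=None):
    """Simulate the urn and return the list B_1, ..., B_{n_steps}."""
    rng = random.Random(seed)
    black, total = 1, 2   # 1 black + 1 white ball at time 0
    b = 0                 # B_n: number of black balls chosen so far
    counts = []
    for _ in range(n_steps):
        if rng.random() < black / total:  # a black ball is drawn ...
            black += 1                    # ... and replaced with an extra black
            b += 1
        total += 1                        # one new ball either way
        counts.append(b)
    return counts

print(polya_urn(10, seed=1))
```

Each step adds exactly one ball, so after step $n$ the urn holds `total == n + 2` balls, of which `black == counts[-1] + 1` are black, matching the setting above.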
I want to show: $\mathbb{P}(B_{n+1}=b | \mathcal{F}_n)=\frac{B_n +1}{n+2}\chi_{B_n = b-1}+ \frac{n+1-B_n}{n+2}\chi_{B_n = b} $
I think that this comes from: $\mathbb{P}(B_{n+1}=b | \mathcal{F}_n)=\mathbb{P}(B_{n+1}=b | B_{n}=b-1)\chi_{B_n = b-1}+ \mathbb{P}(B_{n+1}=b | B_{n}=b)\chi_{B_n = b} $ This of course intuitively makes sense. But why is it true formally?
I tried to state a general version of this:
Let $X$ be a discrete real random variable taking values in $\{x_1, \dots, x_n\}$, and let $\mathcal{F}$ be a $\sigma$-algebra. If $\{F_1,\dots,F_m\} \subset \mathcal{F}$ is a partition of $\Omega$, we have: \begin{equation} \mathbb{E}(\chi_{X=x_i} | \mathcal{F})=\sum_{j=1}^m \mathbb{E}(\chi_{X=x_i} | F_j) \chi_{F_j} \end{equation}
I tried to show this via the defining property of conditional expectation, $\forall B \in \mathcal{F}: \mathbb{E}(\chi_B \chi_{X=x_i})= \mathbb{E}\big(\chi_B \sum_{j} \mathbb{E}(\chi_{X=x_i} | F_j) \chi_{F_j}\big)$, but I failed. Can someone tell me whether my general statement is true, or where the property above comes from? Thank you!
The general statement is false, which is easily seen by taking the trivial partition $\{\Omega\}$. The right-hand side then reduces to $\mathbb E(\chi_{X=x_i}|\Omega)\chi_{\Omega} = \mathbb E[\chi_{X=x_i}]$, and it is obviously not true that every random variable equals its expectation.
In fact, you can show that your right-hand side is just $\mathbb E(\chi_{X=x_i}\,|\,\sigma(F_1,\dots, F_m))$, conditioning on the $\sigma$-algebra generated by the partition. So the statement you want to show, and which "intuitively makes sense" in your question, is actually $$ \mathbb P(B_{n+1}=b|\mathcal F_n) = \mathbb P(B_{n+1}=b|B_n). $$ You want to show that $B_{n+1}$ depends only on $B_n$, not on the rest of $\mathcal F_n$. This property has a name: it is the Markov property, and a process that satisfies it is called a Markov chain.

But it is actually easier to prove the original statement. Just note that, according to the text, the ball chosen in step $n+1$ depends only on the composition of the urn. This means $$ \mathbb P(B_{n+1}-B_{n} = 1 |\mathcal F_n) = \frac{B_n+1}{n+2} $$ a.s., and hence $\mathbb P(B_{n+1}-B_{n} = 0 |\mathcal F_n) = \frac{n+1-B_n}{n+2}$ a.s. We then get the statement as \begin{align} \mathbb P(B_{n+1} = b |\mathcal F_n) &= \mathbb P(B_n=b-1, B_{n+1}-B_{n} = 1 |\mathcal F_n) +\mathbb P(B_n=b, B_{n+1}-B_{n} = 0 |\mathcal F_n) \\ &= \chi_{B_n=b-1}\mathbb P(B_{n+1}-B_{n} = 1 |\mathcal F_n) +\chi_{B_n=b}\mathbb P(B_{n+1}-B_{n} = 0 |\mathcal F_n) \\&= \frac{B_n+1}{n+2}\chi_{B_n=b-1} + \frac{n+1-B_n}{n+2}\chi_{B_n=b}. \end{align} In the second step, we used the fact that for a $\sigma$-algebra $\mathcal F$, an $\mathcal F$-measurable random variable $Y$ can be pulled out of the conditional expectation: $\mathbb E(XY|\mathcal F) = Y\,\mathbb E(X|\mathcal F)$ a.s.
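As an empirical sanity check of the transition probability (and of the claim that, given $\mathcal F_n$, the next draw depends only on $B_n$), one can simulate many runs of the urn, condition on $B_n = k$, and compare the frequency of a black draw at step $n+1$ with $(k+1)/(n+2)$. A sketch; the function name `estimate_transition` is mine:

```python
import random

def estimate_transition(n, k, trials=200_000, seed=0):
    """Run the urn for n steps; among runs with B_n = k, estimate the
    probability that the (n+1)-th draw is black."""
    rng = random.Random(seed)
    hits = n_cond = 0
    for _ in range(trials):
        black, total = 1, 2
        b = 0
        for _ in range(n):                     # steps 1, ..., n
            if rng.random() < black / total:
                black += 1
                b += 1
            total += 1
        if b == k:                             # condition on B_n = k
            n_cond += 1
            if rng.random() < black / total:   # step n+1
                hits += 1
    return hits / n_cond

# With B_n = k, the formula predicts (k+1)/(n+2) for a black draw at step n+1.
n, k = 4, 2
print(estimate_transition(n, k), (k + 1) / (n + 2))
```

The estimate should be close to $(k+1)/(n+2)$ regardless of which particular sequence of draws led to $B_n = k$, which is exactly the Markov property discussed above.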
Edit: If we now turn back to the Markov property, we see that the right-hand side of the equation depends only on $B_n$, so $\mathbb P(B_{n+1}=b| \mathcal F_n)$ is $\sigma(B_n)$-measurable and thus equal to $\mathbb P(B_{n+1}=b| B_n)$. If you can't see why $\sigma(B_n)$-measurability implies that, use the tower property (note $\sigma(B_n)\subseteq\mathcal F_n$): \begin{align} \mathbb P(B_{n+1}=b|B_n) &= \mathbb E\left[\mathbb P(B_{n+1}=b|\mathcal F_n)\,|\,B_n\right] \\ &= \mathbb E\left[\frac{B_n+1}{n+2}\chi_{B_n=b-1} + \frac{n+1-B_n}{n+2}\chi_{B_n=b} \,\middle|\, B_n \right] \\ &= \frac{B_n+1}{n+2}\chi_{B_n=b-1} + \frac{n+1-B_n}{n+2}\chi_{B_n=b} \\ &= \mathbb P(B_{n+1}=b|\mathcal F_n). \end{align}
Edit 2: That, given $\mathcal F_n$, the ball drawn in step $n+1$ depends only on $B_n$ is only written in the text, or rather only implied. We assume that a uniformly random ball is chosen, meaning all balls in the urn are equally likely to be picked, independently of which balls were picked before (that is, of $\mathcal F_n$).