Let $V$ be an absolutely continuous random variable with support $[0,1]$ and density $f$. Assume there exists $0<K<\infty$ such that $0<f(x)<K$ for all $x\in [0,1]$.
Define a sequence $\{v_{n}\}$ as follows: fix initial values $v_{0}, v_1 \in (0,1)$ with $v_{0}<v_{1}$, and for $n \ge 2$ let $v_n$ be the unique number in $[0,1]$ that solves \begin{equation} \begin{aligned} v_{n-1} = \mathbb{E}\big[V\, \big|\, V\in [v_{n-2},v_{n}]\big] = \frac{\int_{v_{n-2}}^{v_{n}}xf(x)\,dx}{\int_{v_{n-2}}^{v_{n}}f(x)\,dx}. \end{aligned} \end{equation} Positivity of $f$ makes the conditional mean strictly increasing in $v_n$, so the solution is unique whenever one exists in $[0,1]$. I want to show that this sequence cannot converge in $[0,1]$.
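To see the mechanics, consider the uniform special case $f \equiv 1$ (which satisfies the assumptions with, say, $K=2$). The recursion then reads
\begin{equation}
v_{n-1} = \frac{\int_{v_{n-2}}^{v_n} x\,dx}{\int_{v_{n-2}}^{v_n} dx} = \frac{v_{n-2}+v_n}{2},
\qquad\text{so}\qquad
v_n - v_{n-1} = v_{n-1} - v_{n-2}.
\end{equation}
The gaps are constant, $v_n = v_0 + n(v_1-v_0)$, and the recursion runs out of room in $[0,1]$ after finitely many steps; so in this special case the sequence certainly does not converge.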
My intuition is the following: convergence requires $v_n - v_{n-1} \to 0$, and the defining condition says that the first moment of $f$ balances on the two sides of $v_{n-1}$,
\begin{equation}
\int_{v_{n-2}}^{v_{n-1}} (v_{n-1}-x)f(x)\,dx = \int_{v_{n-1}}^{v_{n}} (x-v_{n-1})f(x)\,dx \le \frac{K}{2}\,(v_n-v_{n-1})^2.
\end{equation}
So to sustain shrinking right gaps, $v_n - v_{n-1} \le v_{n-1} - v_{n-2}$ with the gaps vanishing, the density would have to place ever more of its mass just below $v_{n-1}$, which the bound $f < K$ should eventually preclude. I have not managed to turn this into a proof.
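As a numerical sanity check, here is a small sketch of the recursion (the density $f(x) = 2(1+x)/3$, the starting values, and all helper names are my own illustrative choices, not part of the question). At each step it solves $\mathbb{E}[V \mid V \in [v_{n-2}, v_n]] = v_{n-1}$ for $v_n$ by bisection, and stops when no solution remains in $[0,1]$:

```python
# Numerical sketch of the recursion v_{n-1} = E[V | V in [v_{n-2}, v_n]].
# Illustrative density (my choice): f(x) = 2(1+x)/3 on [0,1], which is
# bounded away from 0 and satisfies 0 < f(x) < K with K = 2.

def cond_mean(a, b):
    """E[V | V in [a, b]] for f proportional to 1 + x (constant cancels)."""
    num = (b**2 - a**2) / 2 + (b**3 - a**3) / 3   # integral of x*(1+x)
    den = (b - a) + (b**2 - a**2) / 2             # integral of (1+x)
    return num / den

def next_term(a, m, tol=1e-12):
    """Solve cond_mean(a, b) = m for b in (m, 1], or None if no solution."""
    if cond_mean(a, 1.0) < m:
        return None  # even b = 1 gives a mean below m: sequence exits [0,1]
    lo, hi = m, 1.0
    while hi - lo > tol:            # bisection; cond_mean is increasing in b
        mid = (lo + hi) / 2
        if cond_mean(a, mid) < m:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

terms = [0.10, 0.15]
for _ in range(500):                # safety cap; the loop exits much sooner
    b = next_term(terms[-2], terms[-1])
    if b is None:
        break
    terms.append(b)

gaps = [terms[i + 1] - terms[i] for i in range(len(terms) - 1)]
print(f"sequence exits [0,1] after {len(terms)} terms; last gap {gaps[-1]:.4f}")
```

With this increasing density the gaps do shrink from step to step, but only slightly, and the sequence still marches out of $[0,1]$ in finitely many steps, consistent with the conjecture.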
Does anybody have an idea how to tackle this? Or is the statement actually wrong, and there does exist such a sequence that converges inside $[0,1]$?