Prove that $\mathbb{E}(T)$ is infinite for the following problem


A random number generator produces independent random variates $x_0, x_1, x_2, . . .$ drawn uniformly from $[0, 1]$, stopping at the first $x_T$ that is strictly less than $x_0$. Prove that the expected value of $T$ is infinite.

Suggest, with a brief explanation, a plausible value of $\Pr(T = \infty)$ for a real-world (pseudo-)random number generator implemented on a computer.

I haven't come across a continuous problem like this before, and I'm not sure how to prove it.
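Not a proof, but a quick numerical sanity check can build intuition first. The sketch below (Python; the helper name `sample_T` is mine) simulates the process; with an infinite mean, the running sample mean typically keeps growing with the number of trials rather than converging. The resampling guard also hints at the second question: if $x_0$ were the smallest value the generator can produce, no later variate could fall strictly below it.

```python
import random

def sample_T(rng=random):
    """Draw x_0, x_1, ... uniformly from [0, 1); return the index T of the
    first variate strictly less than x_0."""
    x0 = rng.random()
    while x0 == 0.0:       # if x_0 were the smallest representable value,
        x0 = rng.random()  # no later draw could beat it and T would be infinite
    t = 1
    while rng.random() >= x0:
        t += 1
    return t

random.seed(0)
# With an infinite mean, the running sample mean typically grows
# (roughly like log n) instead of settling near a limit.
for n in (10**3, 10**4, 10**5):
    print(n, sum(sample_T() for _ in range(n)) / n)
```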


Accepted answer

Probability that the process stops at step $T$, for a given $x_0$:
$$ x_0(1-x_0)^{T-1}\tag{1} $$
Expected value of $T$ for a given $x_0$:
$$ \begin{align} \sum_{T=1}^\infty Tx_0(1-x_0)^{T-1} &=-x_0\frac{\mathrm{d}}{\mathrm{d}x_0}\sum_{T=0}^\infty(1-x_0)^T\\ &=-x_0\frac{\mathrm{d}}{\mathrm{d}x_0}\frac1{1-(1-x_0)}\\ &=-x_0\frac{\mathrm{d}}{\mathrm{d}x_0}\frac1{x_0}\\ &=\frac1{x_0}\tag{2} \end{align} $$
By the law of total expectation, the expected value of $T$ for a random $x_0\in[0,1]$ is
$$ \int_0^1\frac1{x_0}\,\mathrm{d}x_0=\infty\tag{3} $$
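The geometric computation in $(1)$–$(2)$ is easy to confirm numerically: given $x_0$, each later draw beats $x_0$ independently with probability $x_0$, so $T$ is geometric and its sample mean should approach $1/x_0$. A small Monte Carlo sketch (the helper name is mine):

```python
import random

def sample_T_given_x0(x0, rng=random):
    # T | x_0 is geometric with success probability x_0:
    # keep drawing until a variate falls strictly below x_0.
    t = 1
    while rng.random() >= x0:
        t += 1
    return t

random.seed(0)
n = 200_000
for x0 in (0.5, 0.2, 0.1):
    est = sum(sample_T_given_x0(x0) for _ in range(n)) / n
    print(x0, est, 1 / x0)  # the sample mean should sit near 1/x_0
```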

Second answer

$$ \begin{align} E(T) &= \sum_{n = 1}^\infty n\ P(T = n) \\ &= \sum_{n = 1}^\infty n\ P\left(\{X_1\geq X_0\}\cap \dots\cap\{X_{n-1} \geq X_0\}\cap\{X_n < X_0\}\right) \\ &= \sum_{n = 1}^\infty n\ E\left(P\left(\{X_1\geq X_0\}\cap \dots\cap\{X_{n-1} \geq X_0\}\cap\{X_n < X_0\} \big|\ X_0\right)\right) \\ &= \sum_{n = 1}^\infty n\ E\left(P(X_1 \geq X_0\ |\ X_0) \cdots P(X_{n-1} \geq X_0\ |\ X_0)P(X_n < X_0\ |\ X_0)\right) \tag{1}\label{Cynthia} \\ &= \sum_{n = 1}^\infty nE\left((1-X_0)^{n-1}X_0\right) \tag{2}\label{George} \\ &= \sum_{n = 1}^\infty n \int_0^1 (1-x)^{n-1} x\ dx\\ &= \sum_{n = 1}^\infty n B(2, n) \int_0^1 f_{\beta(2,n)}(x)\ dx \tag{3}\label{Adam} \\ &= \sum_{n = 1}^\infty n B(2, n) \\ &= \sum_{n = 1}^\infty n\cdot \frac{(n-1)!}{(n+1)!} \tag{4}\label{Ben} \\ &= \sum_{n = 1}^\infty \frac{1}{n+1} \\ &= \infty, \end{align} $$

where in \eqref{Adam} I used the $\beta$ distribution, and in \eqref{Ben} I represented the Beta function in terms of factorials.
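The identity behind steps $(3)$–$(4)$ is $B(2,n) = \frac{\Gamma(2)\Gamma(n)}{\Gamma(n+2)} = \frac{(n-1)!}{(n+1)!} = \frac{1}{n(n+1)}$, so $n\,B(2,n) = \frac{1}{n+1}$, whose partial sums grow like $\log N$. A short numerical check of both facts (Python standard library only; the `beta` helper is mine):

```python
import math

def beta(a, b):
    # Euler Beta function via the Gamma function: B(a, b) = G(a)G(b)/G(a+b).
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

# The identity n * B(2, n) = 1/(n+1) used in steps (3)-(4).
for n in (1, 2, 10, 100):
    assert math.isclose(n * beta(2, n), 1 / (n + 1))

# The partial sums of 1/(n+1) grow like log N, hence the series diverges.
for N in (10, 1000, 100000):
    print(N, sum(1 / (n + 1) for n in range(1, N + 1)))
```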

Steps \eqref{Cynthia} and \eqref{George} are intuitively compelling, but formal proofs are not trivial:

Step (1) The $\sigma$-algebras $\sigma(X_0, X_1), \sigma(X_0, X_2), \dots$ are conditionally independent given $\sigma(X_0)$ (w.r.t. $P$).

Proof According to the penultimate paragraph on p. 109 of [1], it suffices to show that, for every $n \in \{1, 2, \dots\}$, $$ \sigma(X_0, X_1, \dots, X_n) \perp_{\sigma(X_0)} \sigma(X_0, X_{n+1}). $$ According to [1] Corollary 6.7 on p. 110, this is tantamount to showing that, for every $n \in \{1, 2, \dots\}$, $$ \sigma(X_0, X_1, \dots, X_n) \perp_{\sigma(X_0)} \sigma(X_{n+1}). \tag{5}\label{Dan} $$

Let $n \in \{1, 2, \dots\}$. According to [1] Proposition 6.6 ("conditional independence, Doob") on p. 110, in order to show \eqref{Dan} it suffices to show that, for every $H \in \sigma(X_{n+1})$, $$ P\left(H\ |\ \sigma(X_0, X_1, \dots, X_n)\right) \overset{P-\mathrm{a.s.}}{=} P(H\ |\ \sigma(X_0)). $$ But, since $\sigma(X_0), \sigma(X_1), \dots, \sigma(X_{n+1})$ are $P$-independent, then, by [1] Corollary 3.7 ("grouping") on p. 51, $$ \sigma(X_0, X_1, \dots, X_n) \perp \sigma(X_{n+1}). $$

Therefore, by the comment in the beginning of p. 105 of [1], $$ P\left(H\ |\ \sigma(X_0, X_1, \dots, X_n)\right) \overset{P-\mathrm{a.s.}}{=} P(H) \overset{P-\mathrm{a.s.}}{=} P(H\ |\ \sigma(X_0)). $$

Q.E.D.

Step (2) For every $n \in \{1, 2, \dots\}$, $$ \begin{align} P\left(X_n < X_0\ |\ \sigma(X_0)\right) &\overset{P-\mathrm{a.s.}}{=} X_0, \\ P\left(X_n \geq X_0\ |\ \sigma(X_0)\right) &\overset{P-\mathrm{a.s.}}{=} 1 - X_0. \end{align} $$

Proof Let $n \in \{1, 2, \dots\}$. Define $B := \{(x,y) \in \mathbb{R}^2 \ |\ x > y\}$. Observe that $B$ is a Borel set. Then $$ P\left(X_n < X_0\ |\ \sigma(X_0)\right) = E\left(\mathbb{1}_{B}(X_0, X_n)\ |\ \sigma(X_0)\right). $$

According to [1] Theorem 6.3 ("Conditional Distribution") on p. 107, there exists a regular conditional distribution $P\left(X_n \in B\ |\ X_0 = x\right)$. In fact, since $X_0$ and $X_n$ are $P$-independent, then, by the comment in the beginning of [1] p. 107, we may assume that $$ P\left(X_n \in B\ |\ X_0 = x\right) = P_{X_n}(B). $$ Therefore, by [1] Theorem 6.4 ("Disintegration") on p. 108, we have $$ \begin{align} E\left(\mathbb{1}_{B}(X_0, X_n)\ |\ \sigma(X_0)\right) &\overset{P-\mathrm{a.s.}}{=} \left(x \mapsto \int \mathbb{1}_B(x, y)\ P_{X_n}(dy)\right) \circ X_0 \\ &= \left(x \mapsto P_{X_n}\left((-\infty,x)\right) \right)\circ X_0 \\ &= \left(x \mapsto P_{U(0,1)}\left((-\infty,x)\right) \right)\circ X_0 \\ &= \left(x \mapsto \begin{cases}0 & , x \leq 0 \\ x &, 0 < x \leq 1 \\ 1 &, x > 1\end{cases}\right) \circ X_0 \\ &\overset{P-\mathrm{a.s.}}{=} X_0. \end{align} $$

This implies, in turn, $$ \begin{align} P\left(X_n \geq X_0\ |\ \sigma(X_0)\right) &= E\left(1-\mathbb{1}_{B}(X_0, X_n)\ |\ \sigma(X_0)\right) \\ &\overset{P-\mathrm{a.s.}}{=} 1 - E\left(\mathbb{1}_{B}(X_0, X_n)\ |\ \sigma(X_0)\right) \\ &\overset{P-\mathrm{a.s.}}{=} 1 - X_0. \end{align} $$

Q.E.D.
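Step (2) also lends itself to an empirical sanity check: conditioning on $X_0$ landing in a narrow bin, the frequency of $\{X_1 < X_0\}$ within that bin should reproduce $P(X_1 < X_0 \mid X_0) = X_0$, i.e. sit near the bin midpoint. A sketch (the bin count and sample size are arbitrary choices of mine):

```python
import random

random.seed(0)
n_pairs = 200_000
bins = 10
hits = [0] * bins    # count of pairs with X_1 < X_0, per bin of X_0
totals = [0] * bins  # number of pairs landing in each bin

for _ in range(n_pairs):
    x0, x1 = random.random(), random.random()
    b = min(int(x0 * bins), bins - 1)
    totals[b] += 1
    hits[b] += (x1 < x0)

# Within each bin, the conditional frequency should sit near the bin midpoint,
# matching P(X_1 < X_0 | X_0) = X_0 almost surely.
for b in range(bins):
    mid = (b + 0.5) / bins
    print(f"x0 ~ {mid:.2f}: P(X1 < X0 | X0) ~ {hits[b] / totals[b]:.3f}")
```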


References

[1] Kallenberg, Olav (2001). Foundations of Modern Probability (2nd ed.). New York: Springer. ISBN 0-387-95313-2.