Show the conditional probability


Let $\theta, U_1, U_2,\ldots$ be independent and uniform on $(0,1)$. Let $X_i = 1$ if $U_i\le\theta$ and $X_i=-1$ if $U_i >\theta$, and let $S_n = X_1 + \cdots+ X_n$. In words, we first pick $\theta$ according to the uniform distribution and then flip a coin with probability $\theta$ of heads to generate a random walk. Compute $\mathbb{P}(X_{n+1} = 1\mid X_1, \ldots,X_n)$.

Let $i_1,\ldots,i_{n}\in \left\{-1,1\right\}$ and let $N$ be the cardinality of $\left\{1\le i\le n: i_i=1 \right\}$. My attempt:
$$\mathbb{P}(X_1=i_1,\ldots,X_n=i_n)=\theta^{N}(1-\theta)^{n-N},$$
$$\mathbb{P}(X_{n+1} = 1,\,X_1=i_1,\ldots,X_n=i_n)=\theta^{N+1}(1-\theta)^{n-N},$$
$$\mathbb{P}(X_{n+1} = 1\mid X_1=i_1, \ldots,X_n=i_n)=\theta.$$

However, I have a sense that it may not be correct.


There are 3 best solutions below


We have, by the definition of conditional probability,

$$P(X_{n+1}\mid X_n,\ldots,X_1)=\frac{P(X_{n+1},X_n,\ldots,X_1)}{P(X_n,\ldots,X_1)}$$

and, by the definition of marginal probability and the conditional independence of the $X_i$ given $\theta$,

$$P(X_k,\ldots,X_1)=\int_0^1\Big(\prod_{i=1}^kP(X_i\mid\theta)\Big)P(\theta)\,d\theta.$$

Can you take it from here?
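As a quick sanity check (not part of the original hint; the helper names are illustrative), with the uniform prior $P(\theta)=1$ each marginal above reduces to a Beta integral, which the standard library can evaluate exactly:

```python
from fractions import Fraction
from math import factorial

def beta(a, b):
    # Exact Beta(a, b) for positive integers: (a-1)!(b-1)!/(a+b-1)!
    return Fraction(factorial(a - 1) * factorial(b - 1), factorial(a + b - 1))

def marginal(obs):
    # P(X_1,...,X_k) = ∫_0^1 theta^N (1-theta)^(k-N) dtheta = Beta(N+1, k-N+1),
    # where N is the number of +1's among the observed values.
    N = sum(1 for x in obs if x == 1)
    return beta(N + 1, len(obs) - N + 1)

obs = [1, -1, 1, 1]                          # n = 4 observations, N = 3 ones
print(marginal(obs + [1]) / marginal(obs))   # → 2/3, i.e. (N+1)/(n+2)
```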


Of course your answer should not depend on $\theta$. With your notation,

$$\frac{\Pr(X_{n+1}=1, X_n=i_n,\ldots, X_1=i_1)}{\Pr( X_n=i_n,\ldots, X_1=i_1)}=\frac{\int_0^1\theta^{N+1}(1-\theta)^{n-N}d\theta}{\int_0^1\theta^{N}(1-\theta)^{n-N}d\theta}=\frac{N+1}{n+2}$$ using $$B(a,b)=\int_0^{1}x^{a-1}(1-x)^{b-1}dx= \frac{\Gamma(a)\Gamma(b)}{\Gamma(a+b)}.$$
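This closed form can be checked by simulating the two-stage experiment directly (a sketch; `simulate` and its parameters are illustrative, not from the original answer). For $n=4$ the conditional probability should be close to $(N+1)/6$ for each observed count $N$:

```python
import random

def simulate(n=4, trials=200_000, seed=0):
    """Estimate P(X_{n+1}=1 | N successes among X_1,...,X_n) by simulation."""
    rng = random.Random(seed)
    counts = {}  # N -> (times X_{n+1}=1, times N was observed)
    for _ in range(trials):
        theta = rng.random()                 # draw theta ~ Uniform(0,1)
        xs = [1 if rng.random() <= theta else -1 for _ in range(n + 1)]
        N = sum(1 for x in xs[:n] if x == 1)
        hit, tot = counts.get(N, (0, 0))
        counts[N] = (hit + (xs[n] == 1), tot + 1)
    return {N: hit / tot for N, (hit, tot) in sorted(counts.items())}

for N, p in simulate().items():
    print(N, round(p, 3), "vs exact", (N + 1) / 6)
```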


The $\sigma$-algebra generated by $X_1,\dots,X_n$ is generated by the partition $\left\{\bigcap_{i=1}^n\{U_i\leqslant \theta\}^{\varepsilon_i},\varepsilon_i\in\{0,1\}\right\}$, where $A^1$ denotes $A$ and $A^0$ the complement of $A$. For $\varepsilon=(\varepsilon_i)_{i=1}^n\in\{0,1\}^n$, write $A_\varepsilon=\bigcap_{i=1}^n\{U_i\leqslant \theta\}^{\varepsilon_i}$. Then by the formula giving the conditional expectation of a random variable with respect to a $\sigma$-algebra generated by a partition, one has $$ \mathbb P(X_{n+1}=1\mid X_1,\dots,X_n)=\sum_{\varepsilon\in\{0,1\}^n} \frac{\mathbb P\left(\{X_{n+1}=1\}\cap A_\varepsilon\right)}{\mathbb P(A_\varepsilon)}\mathbb 1_{A_\varepsilon}. $$ In order to compute the probabilities involved, one can condition on $\theta$ and deduce that these terms depend only on the number of coordinates $i$ for which $\varepsilon_i=1$; we write $c(\varepsilon)$ for this number. If $c(\varepsilon)=k$, then using the formula

$$ \mathbb E\left[g(X,Y)\right]=\int \mathbb E\left[g(X,y)\right]d\mathbb P_{Y}(y), $$ valid for independent random vectors $X$ and $Y$, one gets (taking, without loss of generality, $\varepsilon_i=1$ exactly for $i\leqslant k$) \begin{align} \mathbb P\left(\{X_{n+1}=1\}\cap A_\varepsilon\right)&=\int_0^1\mathbb P\left( \{U_{n+1}\leqslant t\}\cap \bigcap_{i=1}^k\{U_i\leqslant t\}\cap\bigcap_{i=k+1}^n \{U_i> t\}\right)dt\\ &=\int_0^1t^{k+1}(1-t)^{n-k}dt \end{align} and similarly, $\mathbb P(A_\varepsilon)=\int_0^1t^k(1-t)^{n-k}dt$. The formula mentioned by Gerard Letac allows one to simplify the ratio $\mathbb P\left(\{ X_{n+1}=1\}\cap A_\varepsilon\right)/\mathbb P\left( A_\varepsilon\right)$. Finally, letting $B_k:=\bigcup_{\varepsilon: c(\varepsilon)=k}A_\varepsilon$ gives an answer expressed as a linear combination of indicators of $B_k$.
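For completeness, here is the simplified answer written out (a summary consistent with the Beta-integral computation above; the notation $N_n$ for the number of $+1$'s is introduced here). On $B_k$ the ratio of integrals equals $(k+1)/(n+2)$, so

$$\mathbb P(X_{n+1}=1\mid X_1,\dots,X_n)=\sum_{k=0}^{n}\frac{k+1}{n+2}\,\mathbb 1_{B_k}=\frac{N_n+1}{n+2},\qquad N_n:=\#\{1\leqslant i\leqslant n: X_i=1\},$$

which is Laplace's rule of succession.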