Bernoulli process: time of the $i$th success.


I have a question about the following problem.

Let $X_{1},\dots,X_{n}$ be i.i.d. $\mathrm{Bernoulli}(p)$. Let $S_{n}=\sum_{i=1}^{n}X_{i}$, and let $T_{i}$ be the first time the process has $i$ successes, i.e. $T_i=\min\{n: S_n=i\}$. What are $\mathbb{E}(T_{i}\mid T_{i-1})$ and $\mathrm{Var}(T_{i}\mid T_{i-1})$?

My attempt:

We know that: $$P(T_{i}=n\mid T_{i-1}=k)= \binom{n-k}{1}p(1-p)^{n-k-1}=(n-k)p(1-p)^{n-k-1}.$$

Then, $$ \mathbb{E}(T_{i}\mid T_{i-1}=k) =\sum_{m=k+1}^{n}m\,(n-k)p(1-p)^{n-k-1} =(n-k-1)(n-k)p(1-p)^{n-k-1}. $$

Is this right so far? I feel like I did something wrong.


2 Answers

Best answer:

In general, if $X$ and $Y$ are independent, then $\mathsf E(X\mid Y)$ is a degenerate random variable with $\mathbb E(X\mid Y)=\mathbb E X$ a.s.

This is because, in that case, for every Borel set $B$: $$\int_{\{Y\in B\}}X(\omega)\,\mathsf P(d\omega)=\mathsf E[X\,1_B(Y)]=\mathsf E X\,\mathsf E 1_B(Y)=\mathsf E X\,\mathsf P(Y\in B)=\int_{\{Y\in B\}}\mathsf E X\,\mathsf P(d\omega)$$


Now observe that $T_i=G+T_{i-1}$, where $G\sim\text{Geometric}(p)$ counts the number of further trials until the next success, and where $G$ and $T_{i-1}$ are independent.

Then we find:$$\mathsf E(T_{i}\mid T_{i-1})=\mathsf E(T_{i-1}+G\mid T_{i-1})=\mathsf E(T_{i-1}\mid T_{i-1})+\mathsf E(G\mid T_{i-1})=T_{i-1}+\mathsf EG=T_{i-1}+p^{-1}$$

where the last equality is based on what is stated above the line.

This tells us that: $$T_i-\mathsf E(T_i\mid T_{i-1})=T_i-T_{i-1}-p^{-1}=G-p^{-1},$$ showing that $(T_i-\mathsf E(T_i\mid T_{i-1}))^2$ and $T_{i-1}$ are independent.

Then, again applying what is stated above the line, we find $$\mathsf{Var}(T_i\mid T_{i-1}):=\mathsf E\big((T_i-\mathsf E(T_i\mid T_{i-1}))^2\mid T_{i-1}\big)=\mathsf E\big[(G-p^{-1})^2\big]=\mathsf{Var}(G)=\frac{1-p}{p^2}$$
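A quick Monte Carlo check of these two formulas (a sketch, not part of the original answer; the choices of $p$, $i$, $k$, and the sample size are arbitrary):

```python
import random

# Monte Carlo check of E(T_i | T_{i-1}) = T_{i-1} + 1/p and
# Var(T_i | T_{i-1}) = (1-p)/p^2, conditioning on T_{i-1} = k
# by rejection sampling.  p, i, k and the sample size are arbitrary.
random.seed(0)
p, i, k = 0.3, 3, 7
samples = []
while len(samples) < 20_000:
    successes, n = 0, 0
    t_prev = t_cur = None
    while successes < i:              # run trials until the i-th success
        n += 1
        if random.random() < p:
            successes += 1
            if successes == i - 1:
                t_prev = n            # time of the (i-1)-th success
            elif successes == i:
                t_cur = n             # time of the i-th success
    if t_prev == k:                   # keep only runs with T_{i-1} = k
        samples.append(t_cur)

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(f"mean: {mean:.3f}   theory: {k + 1 / p:.3f}")        # T_{i-1} + 1/p
print(f"var:  {var:.3f}   theory: {(1 - p) / p**2:.3f}")    # (1-p)/p^2
```

The empirical conditional mean and variance should land near $k + 1/p$ and $(1-p)/p^2$ respectively, up to sampling noise.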

Second answer:

I believe we should be given an infinite sequence of these i.i.d. random variables. Otherwise $P(T_i = \infty \mid T_{i-1}=k) > 0$, so the conditional expectation is infinite.

Since $T_i$ is the first time $S_n$ takes the value $i$, we must have $X_j = 0$ for all $T_{i-1} < j < T_i$ and $X_{T_i} = 1$; otherwise $T_i$ would not be the first such time. So,

$$P(T_i = m \ |\ T_{i-1} = k) = p(1-p)^{m-k-1}$$ for $k+1 \leq m$ and $0$ otherwise.

$$E(T_i|T_{i-1}=k) = \sum\limits_{m = k+1}^\infty mp(1-p)^{m-k-1} = \sum\limits_{m=0}^\infty (m+k+1)p(1-p)^m = \frac{1-p}{p} + (k+1)$$

$Var(T_i \ |\ T_{i-1})$ can be computed similarly. Please correct me if I made a calculation error!
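As a numeric sanity check on the series above (a sketch; the values of $p$ and $k$ are arbitrary), truncating the infinite sum reproduces the closed form $\frac{1-p}{p} + (k+1) = k + 1/p$:

```python
# Numerical check of E(T_i | T_{i-1} = k) = (1-p)/p + (k+1) = k + 1/p.
# p and k are arbitrary; the infinite sum is truncated where the tail
# is negligible.
p, k = 0.4, 5
approx = sum(m * p * (1 - p) ** (m - k - 1) for m in range(k + 1, k + 2001))
exact = (1 - p) / p + (k + 1)
print(approx, exact)   # both ≈ 7.5, up to truncation/floating-point error
```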