Conditional expectation and Bernoulli process


I'm struggling with an exercise in my Probability Modelling course. Let me explain the context:

  • Let $X_i, i = 1, 2, \dots$ be i.i.d. Bernoulli random variables $B(1, p)$, that is, $P(X_i = 1) = p$ and $P(X_i = 0) = 1 - p$, where $0 < p < 1$.

  • Define a stochastic process $(Y_t)_{t\in \mathbb{N}}$ by $Y_0 = 0, \; Y_t = Y_{t-1} + X_t, \; t = 1, 2, \dots$

So, for the first question, I had to give the distribution of $Y_t$ and $E[Y_t]$:

  • So far, I've worked out that $Y_t=\sum_{i=1}^t X_i \sim Bin(t,p)$ (binomial distribution).
  • Distribution: $P(Y_t=k)=\binom{t}{k}p^k(1-p)^{t-k}$
  • $E[Y_t]=t\times p$
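As a quick sanity check of the first part (not part of the exercise), a Monte Carlo simulation of the process should reproduce $E[Y_t]=tp$; the parameters below are arbitrary illustration choices:

```python
import random

random.seed(0)

def simulate_Y(t, p):
    """One sample path of the process: Y_t = X_1 + ... + X_t."""
    return sum(1 if random.random() < p else 0 for _ in range(t))

t, p, n_runs = 20, 0.3, 100_000  # arbitrary illustration parameters
mean_Yt = sum(simulate_Y(t, p) for _ in range(n_runs)) / n_runs
print(mean_Yt)  # should be close to t * p = 6.0
```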

Now for the 2nd and 3rd questions:

I need to calculate:

  • $E[Y_t \vert Y_{t-1}]$;
  • for $s \le t$: $E[Y_t \vert Y_s]$ and $E[Y_s \vert Y_t]$

And from this point, I cannot find a proper solution. I've tried this, but I don't know how to continue:

$E[Y_t \vert Y_{t-1}] = \sum^t_{k=1}k\times P[Y_t=k \vert Y_{t-1}=k] = \sum^t_{k=1}k\times \frac{P[Y_t=k , Y_{t-1}=k]}{P[Y_{t-1}=k]}$

I've spent time searching on the internet, but I have the feeling that I'm missing something. Thank you in advance to everyone trying to help me.


First verify (by induction) that $Y_t= \sum\limits_{i=1}^{t}X_i$. Use properties of conditional expectations to get your answers.

$E(Y_t|Y_{t-1})=E(\sum\limits_{i=1}^{t-1}X_i+X_t|Y_{t-1})=Y_{t-1}+EX_t=Y_{t-1}+p$
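The identity $E(Y_t|Y_{t-1})=Y_{t-1}+p$ can be checked empirically: simulate many paths, bucket them by the observed value of $Y_{t-1}$, and average $Y_t$ within each bucket. A minimal sketch (parameters are arbitrary):

```python
import random
from collections import defaultdict

random.seed(1)
t, p, n_runs = 10, 0.4, 200_000  # arbitrary illustration parameters

# Accumulate Y_t totals and counts, keyed by the observed Y_{t-1}
sums, counts = defaultdict(float), defaultdict(int)
for _ in range(n_runs):
    y_prev = sum(1 if random.random() < p else 0 for _ in range(t - 1))
    y_t = y_prev + (1 if random.random() < p else 0)
    sums[y_prev] += y_t
    counts[y_prev] += 1

# Within each well-populated bucket, the average of Y_t should be y + p
for y in sorted(counts):
    if counts[y] >= 5000:
        print(y, round(sums[y] / counts[y], 3))
```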

For $s \leq t$, $E(Y_t|Y_s)$ can be obtained in a similar manner: writing $Y_t = Y_s + \sum\limits_{i=s+1}^{t}X_i$ gives $E(Y_t|Y_s) = Y_s + (t-s)p$.

For the last part use the fact that $(X_i)$ is i.i.d. to conclude that $E(X_i|X_1+X_2+\cdots+X_t)$ is the same for all $i \leq t$. Hence, each one of them is $\frac 1 n$ times their sum, which means $E(X_i|X_1+X_2+\cdots+X_t)=\frac 1 n Y_t$. Hence, (summing over $i$ from $1$ to $s$) $E(Y_s|Y_t)=\frac s n Y_t$.