Why is this conditional expectation equal to this probability?


From Stochastic Processes by Ross

In the proof of Lemma 2.3.4 (bottom of page 74, continuing onto page 75), the author writes:

$P(Y_1 + \dots + Y_k < \tau_k, k = 1, \dots, n | \tau_n = u, Y_1 + \dots + Y_n = y) = E[1 - \frac{Y_1 + \dots + Y_{n-1}}{\tau_n} | \tau_n = u, Y_1 + \dots + Y_n = y]$

I can't figure out why this is the case; does anyone have any ideas?


It seems the author is proving this by induction, but I can't figure out how these two things are equal. He shows that:

$P(Y_1 + \dots + Y_k < \tau_k, k = 1, \dots, n | Y_1 + \dots + Y_{n-1}, \tau_n = u, Y_1 + \dots + Y_n) = 1-\frac{Y_1 + \dots + Y_{n-1}}{\tau_n}$

but I can't figure out how to prove that

$E[P(Y_1 + \dots + Y_k < \tau_k, k = 1, \dots, n | Y_1 + \dots + Y_{n-1}, \tau_n = u, Y_1 + \dots + Y_n) \space | \space \tau_n = u, Y_1 + \dots + Y_n = y]$ $= P(Y_1 + \dots + Y_k < \tau_k, k = 1, \dots, n | \tau_n = u, Y_1 + \dots + Y_n = y)$

The lemma restated here is:

Let $\tau_1, \dots, \tau_n$ denote the ordered values from a set of $n$ independent uniform $(0,t)$ random variables and let $Y_1, \dots, Y_n$ be iid nonnegative random variables that are also independent of $\{\tau_1, \dots, \tau_n\}.$ Then $P(Y_1 + \dots + Y_k < \tau_k, k = 1, \dots, n | Y_1 + \dots + Y_n = y) = 1-y/t$ for $0 < y < t$ and $0$ otherwise.
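As a sanity check (not Ross's argument), the lemma can be verified numerically for $n = 2$: for iid exponential $Y_1, Y_2$, the conditional law of $Y_1$ given $Y_1 + Y_2 = y$ is uniform on $(0, y)$, so the conditional probability can be estimated by direct Monte Carlo. This is a minimal sketch; the function name and parameter values are my own choices.

```python
import random

def estimate(t=1.0, y=0.4, trials=200_000, seed=0):
    """Estimate P(Y_1 < tau_1, Y_1 + Y_2 < tau_2 | Y_1 + Y_2 = y)
    for n = 2, exponential Y's, and uniforms on (0, t)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # ordered values of two independent uniform (0, t) variables
        u, v = rng.uniform(0, t), rng.uniform(0, t)
        tau1, tau2 = min(u, v), max(u, v)
        # conditional on Y_1 + Y_2 = y, Y_1 is uniform on (0, y)
        y1 = rng.uniform(0, y)
        # the event {Y_1 < tau_1} and {Y_1 + Y_2 = y < tau_2}
        if y1 < tau1 and y < tau2:
            hits += 1
    return hits / trials

print(estimate())  # should be close to 1 - y/t = 0.6
```

The estimate agrees with $1 - y/t$, which also shows why the uniforms must be on $(0,t)$ rather than $(0,1)$ for the stated formula to make sense.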


Figured it out.

If $g(B) = P(A|B, C=c)$ then $E(g(B)|C=c)= P(A|C=c)$.
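Spelled out, this is the tower property of conditional expectation applied to the indicator of $A$ (a sketch, with $A = \{Y_1 + \dots + Y_k < \tau_k,\ k = 1, \dots, n\}$, $B = Y_1 + \dots + Y_{n-1}$, $C = (\tau_n, Y_1 + \dots + Y_n)$, and $c = (u, y)$):

```latex
\begin{align*}
E\bigl[P(A \mid B,\, C = c) \,\big|\, C = c\bigr]
  &= E\bigl[E[\mathbf{1}_A \mid B, C] \,\big|\, C = c\bigr] \\
  &= E[\mathbf{1}_A \mid C = c] && \text{(tower property)} \\
  &= P(A \mid C = c).
\end{align*}
```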