Stochastic variables independent given Tau


Say we have a filtration $(\mathbb{F}_s)$ and a stopping time $\tau$ with respect to that filtration. Let $X_t$ be a continuous stochastic process (not required to be adapted to the mentioned filtration) such that $X_t$ is independent of $\mathbb{F}_{s_0}$ for all $s_0\leq t$.

I strongly believe (correct me if it is not intuitively clear) that for all $c>0$ $$P(X_{\tau}>c\vert \tau=s)=P(X_{s}>c),$$ or at least $$P(X_{\tau+\epsilon}> c\vert \tau=s)=P(X_{s+\epsilon}> c).$$ Can anybody help me prove it?

What I have tried: in discrete time the proof of the second statement is straightforward: $$P(X_{\tau+1}>c\vert \tau=j)=\dfrac{P(X_{\tau+1}>c,\ \tau=j)}{P(\tau=j)}=\dfrac{P(X_{j+1}>c,\ \tau=j)}{P(\tau=j)}=P(X_{j+1}>c),$$ where the last equality uses that $X_{j+1}$ is independent of $\{\tau=j\}\in\mathbb{F}_j$. I have trouble making it work in continuous time. Our definition of $P(X_{\tau}>c\vert \tau=s)$ is the function $\phi(s)$ such that $\phi(\tau)$ is a.s. the random variable $E(1_{(X_{\tau}>c)}\vert \tau).$
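A quick Monte Carlo sanity check of the discrete-time identity (a sketch, not part of the proof; the model below is hypothetical: $X$ is an i.i.d. Gaussian sequence and $\tau$ the first head of an independent coin sequence, so $\{\tau=j\}$ belongs to a filtration independent of every $X_t$):

```python
import numpy as np

# Hypothetical model: the coin sequence generates the filtration and
# determines tau; X is drawn independently, so each X_t is independent
# of the whole coin filtration, matching the assumption in the question.
rng = np.random.default_rng(0)
n_paths, horizon, c = 200_000, 50, 0.5

coins = rng.random((n_paths, horizon)) < 0.5        # drives the filtration
tau = np.argmax(coins, axis=1)                      # index of the first head
X = rng.standard_normal((n_paths, horizon + 1))     # independent process

j = 3
mask = tau == j
lhs = (X[mask, j + 1] > c).mean()                   # ~ P(X_{tau+1} > c | tau = j)
rhs = (X[:, j + 1] > c).mean()                      # ~ P(X_{j+1} > c)
```

Both estimates should agree (up to Monte Carlo error) with $P(N(0,1)>0.5)\approx 0.31$.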


2 Answers

BEST ANSWER

Consider $$\tau_n = \frac{[2^n\tau] + 1}{2^n},$$ where $[\cdot]$ denotes the integer part. Note that $\tau_n$ is discrete (taking values in $\Bbb{Z}_+/2^n$) and $\tau_n \downarrow \tau$.
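Numerically, the dyadic approximation looks like this (a small sketch; the particular value of $\tau$ below is arbitrary):

```python
import numpy as np

# Dyadic upper approximation tau_n = ([2^n * tau] + 1) / 2^n of a fixed
# realization of tau, with [.] the floor.  Each tau_n lies on the grid
# Z_+ / 2^n, is strictly larger than tau, and decreases to tau.
tau = 0.7310987
taus = [(np.floor(2**n * tau) + 1) / 2**n for n in range(1, 21)]
```

Since $\tau_n-\tau\in(0,2^{-n}]$, the sequence converges from above at a geometric rate.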

Now use the result you have for the discrete case: $$P(X_{\tau_n}>c\vert \tau_n=j)=\dfrac{P(X_{\tau_n}>c,\ \tau_n=j)}{P(\tau_n=j)}=\dfrac{P(X_{j}>c,\ \tau_n=j)}{P(\tau_n=j)}=P(X_{j}>c), \quad j \in \Bbb{Z}_+/ 2^n,$$ where the last equality holds because $\{\tau_n=j\}=\{j-2^{-n}\leq \tau<j\}\in\mathbb{F}_j$ is independent of $X_j$.

Now for the continuous case, note that since $X_t$ is continuous, $X_{\tau_n} \to X_\tau$ and $$ 1_{[X_{\tau} > c]} = \lim_n 1_{[X_{\tau_n} > c]} $$

Therefore $$ E[1_{[X_{\tau_n} > c]} \mid \tau] = \sum_{j \in \Bbb{Z}_+/2^n} P(X_{j}>c)1_{[2^n \tau] + 1 = j} = g^n(\tau)$$

Assume $X$ is bounded; then by the bounded convergence theorem

$$ E[1_{[X_{\tau} > c]} \mid \tau] = \lim_n E[1_{[X_{\tau_n} > c]} \mid \tau] = \lim_n \sum_{j \in \Bbb{Z}_+/2^n} P(X_{j}>c)1_{[2^n \tau] + 1 = j} = \lim_n g^n(\tau) = g(\tau) $$

as for each $t$, $g(t) = \lim_n g^n(t) = P(X_t > c)$. Since $g(\tau)$ is the limit of the $\sigma(\tau)$-measurable random variables $g^n(\tau)$, it is itself $\sigma(\tau)$-measurable, and the claim follows.
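A rough Monte Carlo illustration of the conclusion (a sketch under hypothetical choices: $\tau$ is the first grid time a Brownian motion $B$ reaches level $1$, $X$ is an independent Brownian motion, and $c=0$, so that $P(X_s>0)=\tfrac12$ for every $s>0$):

```python
import numpy as np

# tau = first (grid) hitting time of level 1 by B; X is an independent
# Brownian motion, so X_t is independent of F_t = sigma(B_u : u <= t).
# We check P(X_tau > 0 | tau in a small bin around s) ~ P(X_s > 0) = 1/2.
rng = np.random.default_rng(1)
n_paths, n_steps, T = 100_000, 200, 4.0
dt = T / n_steps

B = np.cumsum(rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt), axis=1)
X = np.cumsum(rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt), axis=1)

hit = B >= 1.0
has_hit = hit.any(axis=1)
k = np.argmax(hit, axis=1)                  # grid index of tau (first hit)

s_idx = int(1.0 / dt)                       # grid index of s = 1.0
mask = has_hit & (np.abs(k - s_idx) <= 3)   # condition on tau close to s
lhs = (X[mask, k[mask]] > 0).mean()         # ~ P(X_tau > 0 | tau ~ s)
rhs = (X[:, s_idx] > 0).mean()              # ~ P(X_s > 0) = 1/2
```

Conditioning is approximated by a small bin around $s$; both estimates should sit near $\tfrac12$ up to Monte Carlo and discretization error.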

For the general case (when $X$ is unbounded), define $X^M = (X \wedge M) \vee 0$, that is, $$X^M =\begin{cases}M & \text{ if } X>M \\X & \text{ if } 0<X\leq M \\0 & \text{ if } X\leq 0 \\\end{cases}$$

Now use the monotone convergence theorem: $$ E[1_{[X_{\tau} > c]} \mid \tau] = \lim_M E[1_{[X^M_{\tau} > c]} \mid \tau] = \lim_M P(X^M_{\tau} > c) = P(X_\tau> c) $$
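In code, the truncation is just a clip, and the monotone limit of the indicators is visible directly (a tiny sketch with hypothetical numbers):

```python
import numpy as np

# X^M = (X ∧ M) ∨ 0 is exactly a clip of X to the interval [0, M].
x = np.array([-1.5, 0.3, 2.0, 7.0])
xM = np.clip(x, 0.0, 2.5)          # the truncation X^M with M = 2.5

# For c >= 0 the indicators 1_{X^M > c} increase to 1_{X > c} as M grows:
c = 3.0
ind = [(np.clip(x, 0.0, M) > c).astype(int) for M in (2.5, 5.0, 10.0)]
```

Here `xM` equals `[0.0, 0.3, 2.0, 2.5]`, and the indicator vectors increase monotonically to $1_{[x>c]}$ once $M$ exceeds the relevant values of $x$.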

ANSWER

I am not sure I understand your notation, and I don't generally understand much of the notation used by stochastic analysts, but I would like to improve, so I'll give it a shot.

Let $(\Omega,\mathbb{F})$ be the probability space and filtration, and $g_t(\omega)=1_{X_t(\omega)>c}$. The question, interpreted into this abstract notation, is to show that $E(g_\tau|\tau)(\omega)=E(g_s)$ whenever $\tau(\omega)=s$, assuming that $X_t$ is independent of $\mathbb{F}_t$ for all $t$.

According to Wikipedia's page on conditional expectation, $E(g_\tau|\tau)=E(g_\tau|\mathcal{H})$ where the $\sigma$-algebra is $\mathcal{H}=\{\tau^{-1}(B):B\subset \mathbb{R}^+\ {\rm is\ Borel}\}$. Let $\mathcal{H}_t\subset \mathbb{F}_t$ be its restriction to $B\subset [0,t]$. Independence implies that $E(g_s|\mathcal{H}_s)=E(g_s)$. When restricted to $\{\omega:\tau(\omega)=s\}=O\in \mathcal{H}_s$ we have that $E(g_s|\mathcal{H}_s)=E(g_\tau|\mathcal{H}_s)$. Still restricting to $O$, $E(g_\tau|\mathcal{H}_s)=E(g_\tau|\mathcal{H})$, because $\{A\cap O: A\in \mathcal{H}_s\}=\{A\cap O: A\in \mathcal{H}\}$.
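The atom-by-atom argument can be mimicked on a finite sample space (a hypothetical toy model: $\tau$ is a function of a coin $U$ generating the "filtration" side, and each $X_s$ is drawn independently of $U$):

```python
import numpy as np

# tau depends only on U; X_1, X_2 are independent of U.  Conditioning on
# sigma(tau) means averaging over each atom {tau = s}; by independence
# this reproduces the unconditional expectation E[g_s].
rng = np.random.default_rng(2)
n = 500_000
U = rng.integers(0, 2, n)
tau = np.where(U == 1, 1, 2)                       # tau in {1, 2}
X = {1: rng.standard_normal(n),                    # X_1 ~ N(0, 1)
     2: rng.standard_normal(n) + 0.3}              # X_2 ~ N(0.3, 1)
c = 0.0

cond = {s: (X[s][tau == s] > c).mean() for s in (1, 2)}   # E[g_tau | tau = s]
uncond = {s: (X[s] > c).mean() for s in (1, 2)}           # E[g_s]
```

On each atom the conditional frequency matches the unconditional one, which is exactly the statement $E(g_\tau|\tau)(\omega)=E(g_s)$ on $\{\tau=s\}$.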

I think the notation is a pain, and maybe I am missing the point of the question. The argument does not seem to need any continuity, although continuity might help if you only had independence of $X_t$ and $\mathbb{F}_s$ for $s<t$.