Conditional expectation of composition of independent stochastic processes


I am searching for a rigorous proof of an assertion I have read a few times now.

Assume two independent (real-valued) stochastic processes $(P(t),~t\in \mathbb{R}_+)$ and $(T(t),~t\in \mathbb{R}_+)$ are given with $T(t)\in\mathbb{R}_+$, $\mathbb{P}$-a.s. for all $t\in \mathbb{R}_+$. Claim: Then

$\mathbb{E} (P(T(t)) \vert T(t)) = \mu_P(T(t))$ for all $t\in \mathbb{R}_+$,

where $\mu_P(s):=\mathbb{E}(P(s))$ for all $s\in \mathbb{R}_+$.
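As a quick numerical sanity check of the claimed identity (not part of a proof), here is a small Monte Carlo sketch in Python. All concrete choices are assumptions made for illustration: $P(s) = s + W(s)$ for a Brownian motion $W$, so $\mu_P(s) = s$, and $T(t)$ is an independent time uniform on two values; the empirical mean of $P(T(t))$ on each event $\{T(t)=s\}$ should then be close to $\mu_P(s) = s$.

```python
import math
import random

random.seed(0)

# Toy model (assumed for illustration): P(s) = s + W(s) for a Brownian
# motion W, so P(s) ~ N(s, s) and mu_P(s) = E P(s) = s.  The time T is
# independent of P and uniform on {1.0, 4.0}.
n = 200_000
sums = {1.0: 0.0, 4.0: 0.0}
counts = {1.0: 0, 4.0: 0}
for _ in range(n):
    t = random.choice([1.0, 4.0])                   # independent random time T
    p = t + math.sqrt(t) * random.gauss(0.0, 1.0)   # P evaluated at that time
    sums[t] += p
    counts[t] += 1

# The empirical conditional mean E[P(T) | T = s] should be close to mu_P(s) = s.
for s in (1.0, 4.0):
    print(s, sums[s] / counts[s])
```

The grouping by the value of $T$ is exactly what conditioning on $\sigma(T(t))$ means here, since $T$ takes only two values in this toy model.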

Since (under reasonable assumptions) the random variable $\mu_P(T(t))$ is $\sigma(T(t))$-measurable, it remains to show that for every set $A\in \sigma(T(t))$ it holds that

$\mathbb{E}({1}_A \mu_P(T(t))) = \mathbb{E}({1}_AP(T(t)))$.

I am having trouble finding the right way to verify this equation. Does anyone have an idea?

Thank you very much in advance!

Accepted answer:

Any $A \in \sigma(T(t))$ can be written in the form $A=\{T(t) \in B\}$ for some measurable set $B$. Consequently, we need to check

$$\mathbb{E}\bigg[ 1_B(T(t)) \mu_P(T(t)) \bigg] = \mathbb{E} \bigg[ 1_B(T(t)) P(T(t)) \bigg] \tag{1}$$

for any measurable set $B$. We will use the following lemma:

Lemma: Let $(X_t)_{t \geq 0}$ be a stochastic process with continuous sample paths. If $S$ is a random variable taking values in $[0,\infty)$ and independent of $(X_t)_{t \geq 0}$, then $$\mathbb{E}(f(X(S),S)) = \mathbb{E}(h(S)),$$ where $$h(s) := \mathbb{E}(f(X(s),s))$$ and $f$ is a "nice" continuous function. ("Nice" means that everything is suitably integrable.)

Proof: Assume first that $S$ takes only countably many values, i.e. $S(\Omega) \subseteq \{s_k; k \geq 1\}$ for a suitable sequence $(s_k)_{k \geq 1}$. Then $$\mathbb{E}(f(X(S),S)) = \sum_{k \geq 1} \mathbb{E}(f(X(S),S) 1_{\{S=s_k\}}) = \sum_{k \geq 1} \mathbb{E}(f(X(s_k),s_k) 1_{\{S=s_k\}}).$$ Since $S$ and $(X_t)_{t \geq 0}$ are independent, we get $$\mathbb{E}(f(X(S),S)) = \sum_{k \geq 1} \mathbb{P}(S=s_k) \mathbb{E}(f(X(s_k),s_k)) = \sum_{k \geq 1} \mathbb{P}(S=s_k) h(s_k) = \mathbb{E}(h(S))$$ with $h(s) := \mathbb{E}(f(X(s),s))$.
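The countable case can be verified exactly on a tiny finite model. The following Python sketch is purely illustrative (the process $X(s) = \xi s$ driven by a two-valued random variable $\xi$ independent of $S$, and the function $f(x,s) = x + s$, are assumed choices): it computes $\mathbb{E} f(X(S),S)$ over the product space and compares it with $\mathbb{E} h(S)$.

```python
from itertools import product

# (value, probability) pairs for the independent random variables S and xi
s_vals = [(1.0, 0.5), (3.0, 0.5)]   # the random time S
x_vals = [(-1.0, 0.5), (2.0, 0.5)]  # the randomness xi driving X

def X(x, s):
    return x * s  # assumed toy process X(s) = xi * s

def f(x, s):
    return x + s  # an assumed "nice" function f

# Left-hand side: E f(X(S), S), computed over the product space
# (this is where independence of S and X enters).
lhs = sum(ps * px * f(X(x, s), s)
          for (s, ps), (x, px) in product(s_vals, x_vals))

# Right-hand side: E h(S) with h(s) = E f(X(s), s)
def h(s):
    return sum(px * f(X(x, s), s) for (x, px) in x_vals)

rhs = sum(ps * h(s) for (s, ps) in s_vals)
assert abs(lhs - rhs) < 1e-12
print(lhs, rhs)
```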

Now consider the general case, i.e. the range of $S$ is not necessarily countable. Define $$S_n := \sum_{k \geq 0} (k 2^{-n}) 1_{\{k 2^{-n} \leq S < (k+1) 2^{-n}\}}.$$ Then each $S_n$ is independent of $(X_t)_{t \geq 0}$, has countable range, and moreover $S_n \uparrow S$ as $n \to \infty$. Using the continuity of the sample paths, dominated convergence, and the first part of the proof, we get \begin{align*} \mathbb{E}(f(X(S),S)) = \lim_{n \to \infty} \mathbb{E}(f(X(S_n),S_n)) = \lim_{n \to \infty} \mathbb{E}(h(S_n)) = \mathbb{E}(h(S)), \end{align*} where the last equality holds because $h$ is itself continuous (again by continuity of the sample paths and dominated convergence).
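The discretization $S_n$ above is just the dyadic floor $S_n = 2^{-n}\lfloor 2^n S\rfloor$. A minimal sketch (the sample value $s = \pi$ is an arbitrary choice) confirms that it is monotonically increasing in $n$ and approximates $S$ from below:

```python
import math

def dyadic_floor(s, n):
    # S_n = k 2^{-n} on the event {k 2^{-n} <= S < (k+1) 2^{-n}}
    return math.floor(s * 2**n) / 2**n

s = math.pi  # an arbitrary sample value of S
approx = [dyadic_floor(s, n) for n in range(1, 11)]

# S_n increases with n and approximates S from below
assert all(a <= b for a, b in zip(approx, approx[1:]))
assert all(a <= s for a in approx)
print(approx[-1])  # 3.140625, within 2**-10 of pi
```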

Remark: The proof works more generally for left-continuous processes and (with a slight modification) also for processes with right-continuous sample paths.


Applying the above lemma with $f(x,s) := g(s)\,x$ (so that $h(s) = g(s)\mu_P(s)$), we get

$$\mathbb{E}\bigg[ g(T(t)) \mu_P(T(t)) \bigg] = \mathbb{E} \bigg[ g(T(t)) P(T(t)) \bigg] \tag{2}$$

for any continuous bounded function $g$. For any open set $U \subset \mathbb{R}$ there is, e.g. by Urysohn's lemma, a sequence $(g_k)_{k \geq 1} \subset C_b$ with $g_k \uparrow 1_U$. Taking the limit in $(2)$ (with $g$ replaced by $g_k$) and using dominated convergence, we obtain that

$$\mathbb{E}\bigg[ 1_U(T(t)) \mu_P(T(t)) \bigg] = \mathbb{E} \bigg[ 1_U(T(t)) P(T(t)) \bigg]. \tag{3}$$
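Explicit choices of the $g_k$ are easy to write down without invoking Urysohn's lemma: for an open set $U$ one can take $g_k(x) = \min(1,\, k \cdot d(x, U^c))$, which is continuous, bounded by $1$, vanishes off $U$, and increases pointwise to $1_U$. A small sketch with the assumed example $U = (0,1)$:

```python
def dist_to_complement(x, a=0.0, b=1.0):
    # Distance from x to the complement of the open interval U = (a, b);
    # the interval is an assumed example of an open set.
    if x <= a or x >= b:
        return 0.0
    return min(x - a, b - x)

def g(k, x):
    # g_k(x) = min(1, k * d(x, U^c)): continuous, bounded by 1,
    # zero outside U, and increasing pointwise to 1_U as k grows.
    return min(1.0, k * dist_to_complement(x))

assert g(5, 1.5) == 0.0                  # outside U, where 1_U is 0
assert g(1, 0.25) == 0.25                # partway up inside U
assert g(100, 0.25) == 1.0               # interior points eventually reach 1
assert all(g(k, 0.25) <= g(k + 1, 0.25)  # monotone increasing in k
           for k in range(1, 50))
```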

Finally, since the open sets generate the Borel $\sigma$-algebra, it follows from a standard monotone class argument that

$$\mathbb{E}\bigg[ 1_B(T(t)) \mu_P(T(t)) \bigg] = \mathbb{E} \bigg[ 1_B(T(t)) P(T(t)) \bigg] \tag{4}$$

for any Borel set $B$, which is exactly $(1)$ and hence proves the claim.