Markov property of the sum of two independent random variables


If $X$ and $Y$ are $d$-dimensional random vectors on $\left(\Omega,\mathscr{F},P\right)$, $\mathscr{G}$ is a sub-$\sigma$-field of $\mathscr{F}$, $X$ is independent of $\mathscr{G}$ and $Y$ is $\mathscr{G}$-measurable, then for every $\Gamma\in\mathscr{B} \left(\mathbb{R}^{d}\right):$

$P\left[X+Y\in\Gamma|\mathscr{G}\right]=P\left[X+Y\in\Gamma|Y\right],\quad\text{a.e. }P$

$P\left[X+Y\in\Gamma|Y=y\right]=P\left[X+y\in\Gamma\right],\quad\text{for }PY^{-1}\text{-a.e. }y\in\mathbb{R}^{d}$

This is Problem 2.5.9 of Karatzas & Shreve, Brownian Motion and Stochastic Calculus.
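Before the proofs, here is a quick Monte Carlo sanity check of the second equation (just an illustration, with the setup chosen by me): take $d=1$, $X,Y$ independent standard normals and $\Gamma=\left(-\infty,c\right]$, so the claim reads $P\left[X+Y\le c|Y=y\right]=\Phi\left(c-y\right)$.

```python
# Monte Carlo check of P[X+Y <= c | Y = y0] = Phi(c - y0) for independent
# standard normal X, Y; conditioning on Y = y0 is approximated by the
# thin slab |Y - y0| < eps.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.standard_normal(n)
Y = rng.standard_normal(n)
c, y0, eps = 0.5, 1.0, 0.02

mask = np.abs(Y - y0) < eps                  # samples with Y close to y0
empirical = (X[mask] + Y[mask] <= c).mean()  # estimate of P[X+Y<=c | Y=y0]

print(f"empirical  P[X+Y<=c | Y~y0] = {empirical:.4f}")
print(f"predicted  Phi(c-y0)        = {norm.cdf(c - y0):.4f}")
```

The two numbers agree up to Monte Carlo error, as the second equation predicts.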

Following the hints for this problem, I can complete the proof of the first equation, but I am stuck on the second one. For completeness, I post my solution to the first equation.

Let us first show that for every $D\in\mathscr{B}\left(\mathbb{R}^{d}\times\mathbb{R}^{d}\right)$ we have $P\left[\left(X,Y\right)\in D|\mathscr{G}\right]=P\left[\left(X,Y\right)\in D|Y\right]$, a.e. $P$.

If $D=B\times C$, where $B,C\in\mathscr{B}\left(\mathbb{R}^{d}\right)$, then

$E\left[\chi_{\left\{ X\in B\right\} }\chi_{\left\{ Y\in C\right\} }|\mathscr{G}\right] =\chi_{\left\{ Y\in C\right\} }E\left[\chi_{\left\{ X\in B\right\} }|\mathscr{G}\right] =\chi_{\left\{ Y\in C\right\} }P\left(X\in B\right)$

and similarly, since $\sigma\left(Y\right)\subseteq\mathscr{G}$ implies that $X$ is also independent of $\sigma\left(Y\right)$, $E\left[\chi_{\left\{ X\in B\right\} }\chi_{\left\{ Y\in C\right\} }|Y\right] =\chi_{\left\{ Y\in C\right\} }E\left[\chi_{\left\{ X\in B\right\} }|Y\right] =\chi_{\left\{ Y\in C\right\} }P\left(X\in B\right),$

so the identity holds for measurable rectangles. The sets $D$ for which $P\left[\left(X,Y\right)\in D|\mathscr{G}\right]=P\left[\left(X,Y\right)\in D|Y\right]$ holds a.e. form a $\lambda$-system, and the measurable rectangles form a $\pi$-system generating $\mathscr{B}\left(\mathbb{R}^{d}\times\mathbb{R}^{d}\right)$, so by Dynkin's $\pi$-$\lambda$ theorem the identity holds for every $D\in\mathscr{B}\left(\mathbb{R}^{d}\times\mathbb{R}^{d}\right)$. Now let $D=\left\{ \left(x,y\right):x+y\in\Gamma\right\}$. Note that $D\in\mathscr{B}\left(\mathbb{R}^{d}\times\mathbb{R}^{d}\right)$: the map $f\left(x,y\right)=x+y$ is continuous from $\mathbb{R}^{d}\times\mathbb{R}^{d}$ to $\mathbb{R}^{d}$, hence $\mathscr{B}\left(\mathbb{R}^{d}\times\mathbb{R}^{d}\right)/\mathscr{B}\left(\mathbb{R}^{d}\right)$-measurable, and $D=f^{-1}\left(\Gamma\right)$. Then we have

$P\left[X+Y\in\Gamma|\mathscr{G}\right]=P\left[X+Y\in\Gamma|Y\right],\quad\text{a.e.}P$
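For completeness, the $\lambda$-system step can be verified as follows (a sketch of the standard argument): let
$$\mathcal{D}=\left\{ D\in\mathscr{B}\left(\mathbb{R}^{d}\times\mathbb{R}^{d}\right):P\left[\left(X,Y\right)\in D|\mathscr{G}\right]=P\left[\left(X,Y\right)\in D|Y\right]\text{ a.e. }P\right\}.$$
Then $\mathbb{R}^{d}\times\mathbb{R}^{d}\in\mathcal{D}$ (take $B=C=\mathbb{R}^{d}$); if $D\in\mathcal{D}$, then $D^{c}\in\mathcal{D}$, because both conditional probabilities of $D^{c}$ equal $1$ minus the corresponding conditional probabilities of $D$; and if $D_{n}\uparrow D$ with $D_{n}\in\mathcal{D}$, then $D\in\mathcal{D}$ by conditional monotone convergence. This is exactly the $\lambda$-system property used above.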

Karatzas & Shreve mention that the second equation can be proved by a similar argument, but I do not know how to proceed. I think my difficulty lies in dealing with the conditioning on $Y=y$. Can anyone help me? Thanks a lot in advance.


There are two answers below.

Best answer:

The mapping $g(y) := \mathbb{P}(X+Y \in \Gamma \mid Y=y)$ is, by definition, the (up to $\mathbb{P}_Y$-null sets unique) Borel measurable mapping such that $$\mathbb{P}(X+Y \in \Gamma \mid Y) = g(Y);$$ such a factorization exists by the Doob–Dynkin lemma.

By the definition of the conditional expectation, this is equivalent to

$$\forall B \in \mathcal{B}(\mathbb{R}^d): \quad \int 1_B(Y) 1_{\Gamma}(X+Y) \, d\mathbb{P} = \int 1_B(Y) g(Y) \, d\mathbb{P} .$$

Let's start with the expression on the left-hand side. We have

$$ \int 1_B(Y) 1_{\Gamma}(X+Y) \, d\mathbb{P} = \int_{\mathbb{R}^d \times \mathbb{R}^d} 1_B(y) 1_{\Gamma}(x+y) \, d\mathbb{P}_{X,Y}(x,y)$$

where $\mathbb{P}_{X,Y}$ denotes the joint distribution of $(X,Y)$. Since $X$ is independent of $\mathscr{G}$ and $Y$ is $\mathscr{G}$-measurable, $X$ and $Y$ are independent, so $d\mathbb{P}_{X,Y}(x,y) = d\mathbb{P}_X(x) \, d\mathbb{P}_Y(y)$, and therefore

\begin{align*} \int 1_B(Y) 1_{\Gamma}(X+Y) \, d\mathbb{P} &= \int_{\mathbb{R}^d} \int_{\mathbb{R}^d} 1_B(y) 1_{\Gamma}(x+y) \, d\mathbb{P}_X(x) \, d\mathbb{P}_Y(y) \\ &= \int_{\mathbb{R}^d} 1_B(y) \underbrace{\mathbb{E}(1_{\Gamma}(X+y))}_{\mathbb{P}(X+y \in \Gamma)} \, d\mathbb{P}_Y(y), \end{align*}

where the first equality is Tonelli's theorem.

If we set $f(y) := \mathbb{P}(X+y \in \Gamma)$ (which is Borel measurable in $y$, again by Tonelli's theorem), then this gives

$$ \int 1_B(Y) 1_{\Gamma}(X+Y) \, d\mathbb{P} = \int 1_B(Y) f(Y) \, d\mathbb{P}.$$

By the considerations at the very beginning of this answer, this means that $$f(y) = \mathbb{P}(X+Y \in \Gamma \mid Y=y) \quad \text{for } \mathbb{P}_Y\text{-a.e. } y,$$ i.e. $$\mathbb{P}(X+Y \in \Gamma \mid Y=y) = \mathbb{P}(X+y \in \Gamma).$$
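As a concrete illustration (a side remark of mine, in the spirit of how the book uses this problem): if $X\sim N\left(0,\sigma^{2}I_{d}\right)$ is independent of $Y$, the second equation reads
$$\mathbb{P}(X+Y\in\Gamma\mid Y=y)=\int_{\Gamma}\left(2\pi\sigma^{2}\right)^{-d/2}\exp\left(-\frac{\left\Vert z-y\right\Vert ^{2}}{2\sigma^{2}}\right)dz,$$
and taking $X=W_{t}-W_{s}$, $Y=W_{s}$ and $\mathscr{G}=\sigma\left(W_{u},u\le s\right)$ for a $d$-dimensional Brownian motion $W$ (so $\sigma^{2}=t-s$) recovers exactly the Markov property of Brownian motion.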

Second answer:

The second equation follows from the following "freezing" lemma: if the random $d$-dimensional vectors $U$ and $V$ are independent, then for every bounded measurable function $f\colon \mathbb R^d\times\mathbb R^d\to\mathbb R$, $$ \mathbb E\left[f\left(U,V\right)\mid V\right]=g(V), $$ where $g(v)=\mathbb E\left[f\left(U,v\right)\right]$.
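Spelling out the application (with $U=X$ and $V=Y$, which are independent since $X$ is independent of $\mathscr{G}\supseteq\sigma\left(Y\right)$, and $f(u,v)=1_{\Gamma}(u+v)$):
$$\mathbb E\left[1_{\Gamma}(X+Y)\mid Y\right]=g(Y),\qquad g(y)=\mathbb E\left[1_{\Gamma}(X+y)\right]=\mathbb{P}\left(X+y\in\Gamma\right),$$
which is precisely the statement $\mathbb{P}\left(X+Y\in\Gamma\mid Y=y\right)=\mathbb{P}\left(X+y\in\Gamma\right)$ for $\mathbb{P}_{Y}$-a.e. $y$.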