Exercise 2.3.29 from Stroock's "Probability Theory: an Analytic View"


I am working on Exercise 2.3.29 from Stroock's "Probability Theory: an Analytic View".

Let $X$ be a multivariate normal random variable defined on a probability space $(\Omega, F, P)$ with mean $0$ and covariance matrix $C$.

For any linear subspace $L$ of $\mathbb{R}^N$, write $F_L$ for the sub-$\sigma$-algebra generated by the random variables $\langle \xi, X \rangle$ as $\xi$ ranges over $L$.

Now let a (nonzero) linear subspace $L$ of $\mathbb{R}^N$ be given and fixed. Write $L'$ for the set of all $\eta \in \mathbb{R}^N$ such that $\langle \eta, C\xi \rangle = 0$ holds for every $\xi \in L$. Clearly $L'$ is a linear subspace. I am asked to show that $F_L$ and $F_{L'}$ are independent $\sigma$-algebras.

Both $\sigma$-algebras are generated by random variables. It suffices to let a generating random variable of each $\sigma$-algebra be given and show that they are independent. To this end, let $\xi \in L$ and $\eta \in L'$ be given. We show that $\langle \xi, X \rangle$ and $\langle \eta, X \rangle$ are independent random variables. It suffices to look at the characteristic functions and show that, for any $t \in \mathbb{R}$ and $s \in \mathbb{R}$, we have:

\begin{equation*} \mathbb{E}[e^{it \langle \xi, X \rangle}e^{is \langle \eta, X \rangle}] = \mathbb{E}[e^{it \langle \xi, X \rangle}] \mathbb{E}[e^{is \langle \eta, X \rangle}] \end{equation*}

It is not clear to me how this holds. Write $\gamma_{0, 1}$ for the standard normal probability measure on $\mathbb{R}$, $\gamma_{0, 1}^N$ for its $N$-fold product on $\mathbb{R}^N$, and $\gamma_{0, C}$ for the law of $X$. The left-hand side equals:

\begin{equation*} \mathbb{E}[e^{it \langle \xi, X \rangle}e^{is \langle \eta, X \rangle}] = \int_{\mathbb{R}^N} e^{it \langle \xi, x \rangle}e^{is \langle \eta, x \rangle} d\gamma_{0, C}(x) = \int_{\mathbb{R}^N} e^{it \langle \xi, C^{\frac{1}{2}}x \rangle}e^{is \langle \eta, C^{\frac{1}{2}}x \rangle} d\gamma_{0, 1}^N(x) \end{equation*}

I have not been able to reach the right-hand side by playing with the transpose.

Best answer:

Note that for any real $a,b$, $$a\langle \xi, X\rangle + b\langle \eta, X\rangle = \langle a\xi+b\eta, X\rangle$$ is normal since $X$ is multivariate normal, so $\langle \xi, X\rangle$ and $\langle \eta, X\rangle$ are jointly normal. It follows that they are independent if and only if they are uncorrelated. Since each has mean $0$, it suffices to show that their product has mean $0$.
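Joint normality also resolves the characteristic-function computation attempted in the question: $t\langle \xi, X\rangle + s\langle \eta, X\rangle = \langle t\xi + s\eta, X\rangle$ is a centered normal variable with variance $\langle t\xi + s\eta, C(t\xi + s\eta)\rangle$, so

\begin{equation*} \mathbb{E}[e^{it \langle \xi, X \rangle}e^{is \langle \eta, X \rangle}] = e^{-\frac{1}{2}\left(t^2\langle \xi, C\xi\rangle + 2ts\langle \xi, C\eta\rangle + s^2\langle \eta, C\eta\rangle\right)} = e^{-\frac{1}{2}t^2\langle \xi, C\xi\rangle}\, e^{-\frac{1}{2}s^2\langle \eta, C\eta\rangle}, \end{equation*}

where the cross term vanishes because $\langle \xi, C\eta\rangle = \langle \eta, C\xi\rangle = 0$ (using the symmetry of $C$), and the right-hand side is exactly $\mathbb{E}[e^{it \langle \xi, X \rangle}]\, \mathbb{E}[e^{is \langle \eta, X \rangle}]$.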

Taking the expectation of both sides of $$\langle \eta, X\rangle\langle \xi, X\rangle=\langle\eta,(XX^T)\xi\rangle$$ yields, by linearity, $$\mathbb{E}[\langle \eta, X\rangle\langle \xi, X\rangle] = \langle\eta,\mathbb{E}[XX^T]\xi\rangle = \langle\eta,C\xi\rangle = 0.$$
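As a quick numerical sanity check (not part of the exercise), one can verify with NumPy that the two projections are uncorrelated. The covariance matrix $C$, the vector $\xi$, and the construction of $\eta$ below are arbitrary illustrative choices; $\eta$ is obtained by projecting a random vector so that $\langle \eta, C\xi \rangle = 0$, i.e. so that $\eta \in L'$ when $L$ is the span of $\xi$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical positive semidefinite covariance matrix C.
N = 4
A = rng.standard_normal((N, N))
C = A @ A.T

# Pick xi arbitrarily; project a second vector v so that <eta, C xi> = 0,
# i.e. eta lies in L' when L is the span of xi.
xi = rng.standard_normal(N)
v = rng.standard_normal(N)
Cxi = C @ xi
eta = v - ((v @ Cxi) / (Cxi @ Cxi)) * Cxi

# Sample X ~ N(0, C) and compare the empirical mean of the product
# <xi, X><eta, X> with its theoretical value <eta, C xi> = 0.
n = 200_000
X = rng.multivariate_normal(np.zeros(N), C, size=n)
u = X @ xi   # samples of <xi, X>
w = X @ eta  # samples of <eta, X>
print(eta @ Cxi)       # zero up to floating-point error
print(np.mean(u * w))  # close to 0
```

The empirical mean of the product converges to $\langle \eta, C\xi \rangle = 0$ at rate $O(n^{-1/2})$, so for $n = 200{,}000$ it is small but not exactly zero.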