Proving independence of random variables defined on a Hilbert space


I have a problem with one proof concerning my thesis on Gaussian measures on Hilbert spaces.

So we have a separable Hilbert space $H$, the Borel $\sigma$-algebra $\mathscr{B}(H)$, and a Gaussian measure $\mu$, defined so that its Fourier transform is $$\widehat{\mu}(h) = e^{i \langle m, h \rangle} e^{-\frac{1}{2} \langle Qh,h \rangle},$$ where $m$ and $Q$ are the mean and the covariance operator, respectively. $Q$ is compact, positive, and symmetric, so $Qe_k = \lambda_k e_k$ holds for some orthonormal basis $(e_k)$ of $H$.

Let's first write $x_k = \langle x, e_k \rangle$ for $x \in H$. For every $n \in \mathbb{N}$ define the map $P_n(x) = \sum_{k=1}^{n} x_k e_k$. We then identify $P_n(H)$ with $\mathbb{R}^n$ through the isomorphism $P_n(x) \mapsto (x_1, \dots, x_n)$. So, if I understand correctly, $P_n$ is a random variable that takes an element of $H$ and maps it to $P_n(H)$, the orthogonal projection of $H$ onto an $n$-dimensional subspace. Since we identify $P_n(H)$ with $\mathbb{R}^n$ through this isomorphism, we can view $P_n$ as an $\mathbb{R}^n$-valued random variable (I'm not 100% sure about this conclusion, so correct me if I'm wrong).
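To make sure I have the identification right, here is how I picture $P_n$ concretely (a toy sketch with made-up coordinates, representing $x$ by its coefficient sequence in the eigenbasis of $Q$):

```python
import numpy as np

# Toy truncation: represent x in H by its first N coordinates
# x_k = <x, e_k> in the eigenbasis (e_k) of Q.
N = 10
x = np.arange(1.0, N + 1)  # made-up coordinates (x_1, ..., x_N)

def P_n(x, n):
    """P_n(x) identified with (x_1, ..., x_n) in R^n."""
    return x[:n]

print(P_n(x, 3))  # the first three coordinates: [1. 2. 3.]
```

So $P_n$ just forgets all coordinates beyond the $n$-th, which is why I think of it as an $\mathbb{R}^n$-valued random variable.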

My task is to prove that the law of $P_n$, denoted by $\mu_n$, is equal to $$\mu_n =\times_{k = 1} ^n N_{m_k,\lambda_k},$$ where $N_{m_k,\lambda_k}$ is the Gaussian measure on $\mathbb{R}$, i.e. $ N_{m,\lambda}(B) = \frac{1}{\sqrt{2\pi\lambda}} \int_B e^{-\frac{(x-m)^2}{2\lambda}} dx$.
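As a numerical sanity check of the claimed product law (not a proof, and assuming the coordinates really are independent $N(m_k, \lambda_k)$, which is exactly what I need to show), I sampled from the candidate measure and compared the empirical marginals with $m_k$ and $\lambda_k$; all values below are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up test data for a truncation with n = 4.
n = 4
m = np.array([1.0, -0.5, 0.0, 2.0])    # m_k = <m, e_k>
lam = np.array([2.0, 1.0, 0.5, 0.25])  # eigenvalues of Q

# Sample from the candidate law: the product of N(m_k, λ_k),
# i.e. independent coordinates x_k = m_k + sqrt(λ_k) ξ_k, ξ_k iid N(0,1).
samples = m + np.sqrt(lam) * rng.standard_normal((200_000, n))

# The empirical marginal means and variances should match m_k and λ_k.
print(samples.mean(axis=0))  # close to m
print(samples.var(axis=0))   # close to lam
```

This matches what the statement predicts for the marginals, but of course it says nothing about why the coordinates are independent in the first place.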

The part of the proof that I can't figure out is the following. I want to show that $x_i e_i$ is independent of $x_j e_j$ for all $i \ne j$. It seems so intuitive, but I just can't prove it rigorously. If I assume this is true, I can work through the rest of the proof, so this is the only part where I need help. Thank you very much in advance.
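To convince myself numerically, I checked the identity that I suspect is the key: evaluating $\widehat{\mu}$ at $h = t e_i + s e_j$ (using $Qe_k = \lambda_k e_k$ and $e_i \perp e_j$, so $\langle Qh, h\rangle = \lambda_i t^2 + \lambda_j s^2$) gives exactly the product of the one-dimensional Gaussian characteristic functions. The numbers below are made up; I still don't see how to turn this factorization into a rigorous independence argument:

```python
import numpy as np

# Made-up test values.
m_i, m_j = 1.0, -2.0     # <m, e_i>, <m, e_j>
lam_i, lam_j = 0.5, 1.5  # eigenvalues of Q

def mu_hat(t, s):
    # μ̂(t e_i + s e_j): since Qe_k = λ_k e_k and e_i ⟂ e_j,
    # <Q(t e_i + s e_j), t e_i + s e_j> = λ_i t² + λ_j s².
    return np.exp(1j * (t * m_i + s * m_j)
                  - 0.5 * (lam_i * t**2 + lam_j * s**2))

def cf_1d(t, m, lam):
    # Characteristic function of N(m, λ) on R.
    return np.exp(1j * t * m - 0.5 * lam * t**2)

t, s = 0.7, -1.3
print(np.isclose(mu_hat(t, s),
                 cf_1d(t, m_i, lam_i) * cf_1d(s, m_j, lam_j)))  # True
```

So the joint characteristic function of $(x_i, x_j)$ factorizes; is that enough to conclude independence, and how do I state it rigorously?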