orthogonal sequences of random variables independence proof


Question is here

It is independence that needs to be shown, isn't it? How should I proceed to prove this independence?

On BEST ANSWER

$\newcommand{\E}{\mathbb{E}}$Use linearity of expectation, together with the fact that the variables are uncorrelated, to compute this directly: $$\E[ZW]=\E\left[(a_1X_1 +\dots+ a_kX_k)\cdot(a_{k+1}X_{k+1} + \dots + a_n X_n) \right] \\= \E [a_1a_{k+1}X_1 X_{k+1}+ \dots + a_1a_nX_1X_n + a_2a_{k+1}X_2X_{k+1}+\dots+a_ka_nX_kX_n] \\ = a_1a_{k+1}\E[X_1X_{k+1}]+ \dots+ a_ka_n\E[X_kX_n] = 0+\dots+0=0, $$ since by assumption $\E[X_iX_j]=0$ whenever $i \neq j$ (and this holds for every product occurring in $ZW$, because the indices in $Z$ range from $1$ to $k$ while those in $W$ range from $k+1$ to $n$).
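The same computation can be stated more compactly as a double sum, which makes it clear that every cross term is covered:
$$\E[ZW]=\E\left[\left(\sum_{i=1}^{k} a_i X_i\right)\left(\sum_{j=k+1}^{n} a_j X_j\right)\right]=\sum_{i=1}^{k}\sum_{j=k+1}^{n} a_i a_j\,\E[X_i X_j]=0,$$
since $i \le k < j$ forces $i \neq j$ in every term.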

Thus the equality $\E[ZW]=\E[Z]\E[W]$ holds as long as $\E[Z]=0$ or $\E[W]=0$ (because then $\E[Z]\E[W]=0$). Do you think you can take it from there?
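If it helps to see the identity numerically, here is a small sketch (not part of the proof): it draws independent standard normals, which in particular are pairwise uncorrelated with mean zero, forms $Z$ and $W$ with arbitrarily chosen coefficients $a_i$ (the specific values below are just an example), and checks that the sample estimate of $\E[ZW]$ is close to $0=\E[Z]\E[W]$.

```python
import numpy as np

# Illustrative check: X_1, ..., X_n independent standard normals,
# so E[X_i X_j] = 0 for i != j and E[X_i] = 0.
rng = np.random.default_rng(0)
n_samples = 200_000
k, n = 3, 6
a = np.array([2.0, -1.0, 0.5, 1.5, -0.5, 3.0])  # arbitrary coefficients

X = rng.standard_normal((n_samples, n))  # each row is one draw of (X_1, ..., X_n)
Z = X[:, :k] @ a[:k]     # Z = a_1 X_1 + ... + a_k X_k
W = X[:, k:] @ a[k:]     # W = a_{k+1} X_{k+1} + ... + a_n X_n

print(np.mean(Z * W))             # sample estimate of E[ZW]: close to 0
print(np.mean(Z) * np.mean(W))    # sample estimate of E[Z]E[W]: close to 0
```

Of course a simulation only illustrates the algebra above; the proof itself is the expectation computation.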