Compute $\operatorname{cov}\left(\sum_{i=1}^{N}X_i,\sum_{i=1}^{N}Y_i\right)$


Let $X_1,X_2,\ldots$ be i.i.d. Bernoulli random variables with parameter $\frac{1}{4}$. Let $Y_1,Y_2,\ldots$ be another sequence of i.i.d. Bernoulli random variables with parameter $\frac{3}{4}$. Let $N$ be a geometric random variable with parameter $\frac{1}{2}$, i.e. $P(N=k)=\frac{1}{2^k}$ for $k=1,2,\ldots$. Assume the $X_i$'s, $Y_j$'s and $N$ are all independent. Compute $\operatorname{cov}\left(\sum\limits_{i=1}^{N}X_i,\sum\limits_{i=1}^{N}Y_i\right)$.

My try:

See that $\sum X_i\mid N=n \sim \mathsf{Bin}(n,\frac{1}{4})$ and $\sum Y_i\mid N=n \sim \mathsf{Bin}(n,\frac{3}{4})$.

Now, by the law of total expectation, $E\left(\sum\limits_{i=1}^{N} X_i\right)=\frac{1}{2}$ and $E\left(\sum\limits_{i=1}^{N} Y_i\right)=\frac{3}{2}$.
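(Spelling out that conditioning step, using $\mathbb E(N)=\frac{1}{1/2}=2$ for the geometric distribution:)

$$\mathbb E\left(\sum_{i=1}^{N} X_i\right)=\mathbb E\left(\mathbb E\left(\sum_{i=1}^{N} X_i \,\middle|\, N\right)\right)=\mathbb E\left(\tfrac14 N\right)=\tfrac14\cdot 2=\tfrac12, \qquad \mathbb E\left(\sum_{i=1}^{N} Y_i\right)=\mathbb E\left(\tfrac34 N\right)=\tfrac32.$$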

But how can I find $E\left(\sum\limits_{i=1}^{N} X_i \sum\limits_{i=1}^{N} Y_i\right)$?

Best answer:

In the same way, the law of total expectation gives $$\mathbb E\left(\sum_{i=1}^{N} X_i\cdot \sum_{i=1}^{N} Y_i\right)=\mathbb E\left(\mathbb E\left(\sum_{i=1}^{N} X_i \cdot \sum_{i=1}^{N} Y_i\biggm| N\right)\right).$$ Since, given $N$, the two sums are independent, this expression can be written as $$ \mathbb E\left(\sum_{i=1}^{N} X_i\cdot \sum_{i=1}^{N} Y_i\right)=\mathbb E\left(\mathbb E\left(\sum_{i=1}^{N} X_i \biggm | N\right) \cdot \mathbb E\left(\sum_{i=1}^{N} Y_i\biggm| N\right)\right). $$ Next, use that given $N$ both sums have binomial distributions, with conditional means $\frac14 N$ and $\frac34 N$, and get $$ \mathbb E\left(\sum_{i=1}^{N} X_i\cdot \sum_{i=1}^{N} Y_i\right)=\mathbb E\left( \frac14N\cdot \frac34N\right)=\frac{3}{16}\mathbb E(N^2)=\frac{3}{16}\left(\operatorname{Var}(N)+(\mathbb E(N))^2\right)=\frac{3}{16}(2+4)=\frac{9}{8}.$$ Finally $$ \operatorname{cov}\left(\sum_{i=1}^{N} X_i, \sum_{i=1}^{N} Y_i\right)=\frac{9}{8}-\frac12\cdot\frac32=\frac38. $$
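As a sanity check, the value $\frac38$ can be confirmed numerically. The sketch below (names and trial count are my own choices) draws $N$ as a geometric variable on $\{1,2,\ldots\}$, then draws the two sums conditionally on $N$ as $\mathsf{Bin}(N,\frac14)$ and $\mathsf{Bin}(N,\frac34)$ via Bernoulli trials, and estimates the sample covariance:

```python
import random

def random_sums(rng):
    # Draw N with P(N = k) = 1/2^k on {1, 2, ...}, then the two sums given N.
    n = 1
    while rng.random() >= 0.5:
        n += 1
    sx = sum(rng.random() < 0.25 for _ in range(n))  # sum of N Bernoulli(1/4)
    sy = sum(rng.random() < 0.75 for _ in range(n))  # sum of N Bernoulli(3/4)
    return sx, sy

def estimate_cov(trials=200_000, seed=0):
    # Plain Monte Carlo estimate of cov(sum X_i, sum Y_i).
    rng = random.Random(seed)
    pairs = [random_sums(rng) for _ in range(trials)]
    mx = sum(sx for sx, _ in pairs) / trials
    my = sum(sy for _, sy in pairs) / trials
    return sum((sx - mx) * (sy - my) for sx, sy in pairs) / trials

print(estimate_cov())  # close to 3/8 = 0.375
```

The estimate lands near $0.375$, matching the derivation above.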