Proving a relation between the probability of a deviation and the variance


Let $X_1,\ldots,X_n$ be random variables which are pairwise independent.

I have already proven these two equalities in earlier parts of this problem: $$\operatorname{Var}\left(\sum^n_{i=1}X_i\right)= \sum^n_{i=1} \operatorname{Var}(X_i)$$ $$E\left(\left(\sum^n_{i=1}X_i\right)^2\right)=\sum^n_{i=1}E(X_i^2) + \sum_{i\not= j} E(X_i)\cdot E(X_j)$$

Now I am trying to answer the following part:

Conclude that if $X_1, \ldots , X_n$ all have expectation $\mu$ and variance $\sigma^2$, then for $X=\sum^n_{i=1}X_i$, $$P(|X-n\mu|\geq c) \leq \frac{n\sigma^2}{c^2}.$$

I'm lost on what $c$ is supposed to be and whether/how the two earlier parts factor into this last part. Any help is appreciated, thanks.

Best answer

The identity for $E\left(\left(\sum\limits_{i=1}^{n}X_i\right)^{2}\right)$ follows by simply expanding the square and using the fact that $E(X_iX_j)=E(X_i)E(X_j)$ for $i \neq j$ (by pairwise independence).
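Written out, the expansion of the square separates the diagonal terms from the cross terms:

```latex
\left(\sum_{i=1}^{n}X_i\right)^{2}
  = \sum_{i=1}^{n}X_i^{2} + \sum_{i\neq j} X_i X_j .
```

Taking expectations and applying $E(X_iX_j)=E(X_i)E(X_j)$ to each cross term gives the stated identity.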

Now $\operatorname{Var}\left(\sum\limits_{i=1}^{n}X_i\right)=E\left(\left(\sum\limits_{i=1}^{n}X_i\right)^{2}\right) - \left(E\left(\sum\limits_{i=1}^{n}X_i\right)\right)^{2}$. Use the previous identity and the fact that $E\left(\sum\limits_{i=1}^{n}X_i\right)=\sum\limits_{i=1}^{n}E(X_i)$ to show that $\operatorname{Var}\left(\sum\limits_{i=1}^{n}X_i\right)=\sum\limits_{i=1}^{n}\operatorname{Var}(X_i)$.
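To see the cancellation explicitly: squaring the sum of expectations produces the same cross terms as the second-moment identity, so they subtract away:

```latex
\left(\sum_{i=1}^{n}E(X_i)\right)^{2}
  = \sum_{i=1}^{n}E(X_i)^{2} + \sum_{i\neq j} E(X_i)E(X_j),
\qquad\text{hence}
\operatorname{Var}\!\left(\sum_{i=1}^{n}X_i\right)
  = \sum_{i=1}^{n}\bigl(E(X_i^{2}) - E(X_i)^{2}\bigr)
  = \sum_{i=1}^{n}\operatorname{Var}(X_i).
```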

The last part is now immediate from Chebyshev's inequality: here $c$ is an arbitrary positive constant, and the two earlier parts supply exactly the mean and variance of $X$ that the inequality needs.
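Concretely, with $X=\sum_{i=1}^{n}X_i$ we have $E(X)=n\mu$ by linearity and $\operatorname{Var}(X)=n\sigma^2$ by the first identity, so for any $c>0$ Chebyshev's inequality gives:

```latex
P\bigl(|X - n\mu| \geq c\bigr)
  \leq \frac{\operatorname{Var}(X)}{c^{2}}
  = \frac{n\sigma^{2}}{c^{2}}.
```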