According to Wikipedia:
$$\text{Var}\left( \sum_{i=1}^n X_i \right) = \sum_{i=1}^n \sum_{j=1}^n \text{Cov}(X_i, X_j)$$
I can't figure out why this is true (I can't prove it).
- How can we prove it?
- Is there a simple example that will help me understand this equation?
This can be proved using the definition of variance and covariance.
Using the definition of variance and the linearity of the expected value, you have $$Var\left(\sum_{i=1}^n X_i\right)=\mathbb{E}\left[\left(\sum_{i=1}^nX_i-\mathbb{E}\left[\sum_{i=1}^nX_i\right]\right)^2\right] = \mathbb{E}\left[\left(\sum_{i=1}^n(X_i-\mathbb{E}\left[X_i\right])\right)^2\right].$$ Let's define $a_i=X_i-\mathbb{E}\left[X_i\right]$. Expanding the right-hand side of the previous equation, you get $$\mathbb{E}\left[\left(\sum_{i=1}^na_i\right)^2\right]=\mathbb{E}\left[\sum_{i=1}^n\sum_{j=1}^n a_ia_j\right]=\sum_{i=1}^n\sum_{j=1}^n\mathbb{E}\left[ (X_i-\mathbb{E}\left[X_i\right])(X_j-\mathbb{E}\left[X_j\right])\right]=\sum_{i=1}^n\sum_{j=1}^n\text{Cov}(X_i,X_j),$$ since each term $\mathbb{E}\left[(X_i-\mathbb{E}\left[X_i\right])(X_j-\mathbb{E}\left[X_j\right])\right]$ is exactly the definition of $\text{Cov}(X_i,X_j)$. This is what you were after.
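If you want to convince yourself numerically (not a proof, just a sanity check), you can compare the sample variance of a sum against the sum of all entries of the sample covariance matrix; the identity holds exactly for the sample statistics as long as the same normalization (`ddof`) is used on both sides. The particular variables below are an arbitrary illustration:

```python
import numpy as np

# Numerical sanity check of Var(sum_i X_i) = sum_{i,j} Cov(X_i, X_j).
# np.cov uses ddof=1 by default, so we pass ddof=1 to np.var as well.
rng = np.random.default_rng(0)

n_samples = 100_000
x1 = rng.normal(size=n_samples)
x2 = 0.5 * x1 + rng.normal(size=n_samples)   # correlated with x1
x3 = -0.3 * x1 + rng.normal(size=n_samples)  # correlated with x1
X = np.vstack([x1, x2, x3])                  # rows are variables

lhs = np.var(X.sum(axis=0), ddof=1)  # Var(X1 + X2 + X3)
rhs = np.cov(X).sum()                # sum of all covariance-matrix entries

print(lhs, rhs)
assert np.isclose(lhs, rhs)
```

The off-diagonal entries of `np.cov(X)` are the cross terms $\text{Cov}(X_i, X_j)$ for $i \neq j$; with independent variables they would (approximately) vanish and the identity would reduce to the familiar "variance of a sum is the sum of variances".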
I am not sure if you can view the equality in a more intuitive way, but the result comes from the linearity of the expected value and the expansion of the square of a sum.
Maybe it helps to be a bit more explicit in the case $n=2$. $$Var\left(X_1 + X_2\right)=\mathbb{E}\left[\left(X_1 + X_2-\mathbb{E}\left[X_1+X_2\right]\right)^2\right]=\mathbb{E}\left[\left(X_1 + X_2-\mathbb{E}\left[X_1\right]-\mathbb{E}\left[X_2\right]\right)^2\right].$$ Now you have to expand the square of the sum. One way to go about it is to expand term by term and see what you can do with the resulting expression.
Alternatively, notice that $X_1 -\mathbb{E}\left[X_1\right]$ is the same term that appears in the definition of $Var(X_1)$, so it is not such a bad idea to write the last term as $$\mathbb{E}\left[\left((X_1-\mathbb{E}\left[X_1\right])+(X_2-\mathbb{E}\left[X_2\right])\right)^2\right].$$ Expanding the square now, you can easily get the result.
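Carrying that expansion through, with $a_i=X_i-\mathbb{E}\left[X_i\right]$ as before:
$$\mathbb{E}\left[(a_1+a_2)^2\right]=\mathbb{E}\left[a_1^2\right]+2\,\mathbb{E}\left[a_1a_2\right]+\mathbb{E}\left[a_2^2\right]=Var(X_1)+2\,\text{Cov}(X_1,X_2)+Var(X_2),$$
which is precisely $\sum_{i=1}^2\sum_{j=1}^2\text{Cov}(X_i,X_j)$, since $\text{Cov}(X_i,X_i)=Var(X_i)$ and $\text{Cov}(X_1,X_2)=\text{Cov}(X_2,X_1)$.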