Why is the variance $$\sum_{i=1}^{m}\sum_{j=1}^{m}t_{i}t_{j}{\rm cov}(X_i,X_j)\,?$$
Can someone derive this? Why don't they use the variance $$\sigma_{Y}^{2}=\sum_{i=1}^{n}\alpha_{i}^{2}\sigma_{i}^{2}$$
That is derived here:
Recall that
$$ {\rm cov}(X, Y) = \mathbb{E}[(X - \mathbb{E}[X])(Y-\mathbb{E}[Y])] $$
so that
$$ {\rm cov}(X, X) = \mathbb{E}[(X - \mathbb{E}[X])^2] = {\rm Var}(X) $$
Therefore, if you call $Y = \sum_i t_i X_i$, then by the bilinearity of covariance
$$ {\rm Var}(Y) = {\rm cov}(Y, Y) = {\rm cov}\left( \sum_i t_i X_i, \sum_j t_j X_j\right) = \sum_{i,j}t_it_j\,{\rm cov}(X_i,X_j) \tag{1} $$
With this being said, remember that for any two r.v.s $X$ and $Y$
$$ {\rm Var}(X + Y) = {\rm Var}(X) + {\rm Var}(Y) + 2\,\color{red}{{\rm cov}(X,Y)} $$
Or in general
$$ {\rm Var}\left(\sum_i t_i X_i\right) = \sum_{i}t_i^2{\rm Var}(X_i) + \sum_{i\not= j}t_it_j \color{red}{{\rm cov}(X_i,X_j)} \tag{2} $$
which is exactly the same result as in (1). The problem with your last expression is that it neglects the covariance (the red term in the equations above); it only holds when the $X_i$ are pairwise uncorrelated, i.e. ${\rm cov}(X_i, X_j) = 0$ for all $i \neq j$.
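You can check (1) numerically. The sketch below (variable names, the covariance matrix, and the sample size are all illustrative choices, not from the question) simulates correlated $X_i$, computes the empirical variance of $Y=\sum_i t_i X_i$, and compares it with the double sum $\sum_{i,j}t_it_j\,{\rm cov}(X_i,X_j)$ and with the covariance-free formula:

```python
import numpy as np

# Hypothetical example: verify Var(sum_i t_i X_i) = sum_{i,j} t_i t_j cov(X_i, X_j)
rng = np.random.default_rng(0)
t = np.array([1.0, -2.0, 0.5])

# Draw correlated samples X_1, X_2, X_3 from a fixed covariance matrix.
Sigma = np.array([[2.0, 0.8, 0.3],
                  [0.8, 1.5, 0.6],
                  [0.3, 0.6, 1.0]])
X = rng.multivariate_normal(mean=np.zeros(3), cov=Sigma, size=200_000)

Y = X @ t                      # Y = sum_i t_i X_i
lhs = Y.var()                  # empirical Var(Y)
C = np.cov(X, rowvar=False)    # empirical covariance matrix cov(X_i, X_j)
rhs = t @ C @ t                # sum_{i,j} t_i t_j cov(X_i, X_j)
print(lhs, rhs)                # the two agree up to sampling noise

# Dropping the cross terms (the red covariance terms) gives a different number:
wrong = (t**2 * np.diag(C)).sum()
print(wrong)
```

The `wrong` value (the sum $\sum_i t_i^2\,{\rm Var}(X_i)$ alone) differs from `lhs` precisely by the neglected cross-covariance terms.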