I want to calculate the variance of a sum of random variables $ (X_1+X_2) $.
Doing statistics, I have to learn the maths starting from the end, which is quite difficult yet very interesting (please consider that I have only very basic maths skills).
For now I am doing this calculation by hand with the formula $ \mathrm{var}(X_1+X_2) = \mathrm{var}(X_1) + \mathrm{var}(X_2) + 2\,\mathrm{cov}(X_1,X_2)$.
But I am now facing much larger sums (some with minus signs), and being able to do this with matrix calculations would save me a lot of time (and would be very satisfying too).
I searched through resources on matrix calculus but couldn't find anything usable at my level.
How can I do this calculation from the variance-covariance matrix $$ \begin{pmatrix} \mathrm{var}(X_1) & \mathrm{cov}(X_1,X_2) \\ \mathrm{cov}(X_1,X_2) & \mathrm{var}(X_2) \\ \end{pmatrix} $$ [preferably extended to subtractions and $n$ terms, like $ (X_1+X_2-X_3) $]?
NB: this is not a statistics question and doesn't belong on stats.stackexchange. I want to understand the thought process of turning a scalar calculation into a matrix one.
Let $X$ be the $n\times p$ data matrix whose columns are the variables, and $\bar X$ the matrix of column means. The variance-covariance matrix of $X$ is $\frac1n(X-\bar X)^T(X-\bar X)$.
Now, you want the variance of the linear combination $u=X\beta$ for some weight vector $\beta$. Since $\beta$ is constant, $\bar u=\bar X\beta$, so this variance is
$$\mathrm{Var}(u)=\frac1n(u-\bar u)^T(u-\bar u)=\frac1n(X\beta-\bar X\beta)^T(X\beta-\bar X\beta)\\ =\frac1n\beta^T(X-\bar X)^T(X-\bar X)\beta=\beta^T\mathrm{Var}(X)\beta.$$
For your examples, $X_1+X_2$ corresponds to $\beta=(1,1)^T$, and $X_1+X_2-X_3$ to $\beta=(1,1,-1)^T$: a subtraction is simply a $-1$ entry in $\beta$.
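As a sanity check, here is a short NumPy sketch (with made-up random data) verifying that the quadratic form $\beta^T\mathrm{Var}(X)\beta$ matches the ordinary sample variance of the combined variable $X_1+X_2-X_3$:

```python
import numpy as np

# Hypothetical sample data: 1000 observations of three variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))

# Weight vector for X1 + X2 - X3: the minus sign becomes a -1 entry.
beta = np.array([1.0, 1.0, -1.0])

# Variance-covariance matrix of the columns of X.
Sigma = np.cov(X, rowvar=False)

# Quadratic form beta^T Sigma beta ...
var_matrix = beta @ Sigma @ beta

# ... equals the ordinary sample variance of the combined variable.
var_direct = np.var(X @ beta, ddof=1)

print(np.isclose(var_matrix, var_direct))
```

Note that `np.cov` and `ddof=1` both use the $n-1$ denominator rather than the $\frac1n$ in the derivation above; the identity $\mathrm{Var}(X\beta)=\beta^T\mathrm{Var}(X)\beta$ holds either way, as long as you are consistent.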