So apparently there are at least three rules for calculating the variance of a sum, $\mathrm{Var}\left(\sum_{i=1}^nX\right)$. When is each rule valid? Are they equivalent, or special cases of one another?
\begin{align*} &\mathrm{Var}\left(\sum_{i=1}^nX\right) &&= \mathrm{Var}\left(nX\right) \\&&&= \mathrm{cov}(X^2,n^2)+\left(\mathrm{Var}(X)+\mathrm{E}(X)^2\right)\left(\mathrm{Var}(n)+\mathrm{E}(n)^2\right)-\left(\mathrm{cov}(X,n)+\mathrm{E}(X)\mathrm{E}(n)\right)^2 \\ &\mathrm{Var}\left(\sum_{i=1}^nX\right) &&= \mathrm{E}\left(\mathrm{Var}\left(\sum_{i=1}^nX\middle|n\right)\right) + \mathrm{Var}\left(\mathrm{E}\left(\sum_{i=1}^nX\middle|n\right)\right) \\ &\mathrm{Var}\left(\sum_{i=1}^nX\right) &&= \sum_{i=1}^n\mathrm{Var}\left(X_i\right) + \sum_{i\neq j}\mathrm{cov}\left(X_i, X_j\right) \end{align*}
They are for different cases.
When $n$ is a constant and every summand is the same random variable $X$.
\begin{align*}\mathsf{Var}\left(\sum_{k=1}^n X\right) &= \mathsf{Var}(nX) \\ &= n^2\,\mathsf{Var}(X)\end{align*}
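A quick Monte Carlo sanity check of this rule (the choice $X\sim\mathrm{Exponential}(1)$, so $\mathsf{Var}(X)=1$, and the use of `numpy` are arbitrary illustration):

```python
import numpy as np

# Sanity check of Var(nX) = n^2 Var(X) for a constant n.
# X ~ Exponential(1) (so Var(X) = 1) is an arbitrary choice.
rng = np.random.default_rng(0)
n = 5
X = rng.exponential(1.0, size=1_000_000)

mc = np.var(n * X)           # empirical Var of the n-fold sum X + ... + X
identity = n**2 * np.var(X)  # same sample, via the n^2 Var(X) rule
print(mc, identity)          # both close to n^2 * Var(X) = 25
```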
When $N$ is a random variable and every summand is the same random variable $X$.
\begin{align*}\mathsf{Var}\left(\sum_{k=1}^N X\right) &= \mathsf{Var}(NX) \\ &= \mathsf E(N^2X^2)-\mathsf E(NX)^2 \\ &= \mathsf{Cov}(N^2,X^2)+\mathsf E(N^2)\,\mathsf E(X^2)-\left(\mathsf{Cov}(N,X)+\mathsf E(N)\,\mathsf E(X)\right)^2 \\ &\quad\text{etc.}\end{align*}
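This case can also be checked numerically. Here $N\sim\mathrm{Poisson}(3)$ and $X\sim\mathrm{Uniform}(0,1)$, drawn independently, are arbitrary choices; with independence the covariance terms in the expanded form vanish, so the theoretical value is $\mathsf E(N^2)\mathsf E(X^2)-(\mathsf E(N)\mathsf E(X))^2 = 12\cdot\tfrac13 - 1.5^2 = 1.75$:

```python
import numpy as np

# Sanity check of Var(NX) = E(N^2 X^2) - E(NX)^2 with N and X both random.
# N ~ Poisson(3) and X ~ Uniform(0,1), independent, are arbitrary choices;
# independence kills the Cov terms, so the theoretical variance is
#   E(N^2) E(X^2) - (E(N) E(X))^2 = 12/3 - 2.25 = 1.75.
rng = np.random.default_rng(1)
N = rng.poisson(3, size=1_000_000).astype(float)
X = rng.uniform(0.0, 1.0, size=1_000_000)

mc = np.var(N * X)                                # direct empirical variance
moments = np.mean(N**2 * X**2) - np.mean(N * X)**2  # via the moment identity
print(mc, moments)  # both close to 1.75
```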
When $n$ is a constant and $(X_k)$ is a sequence of random variables, possibly identically distributed and possibly also independent.
\begin{align*}\mathsf{Var}\left(\sum_{k=1}^n X_k\right) &= \mathsf{Cov}\left(\sum_{j=1}^n X_j,\sum_{k=1}^n X_k\right) \\ &= \sum_{j=1}^n\sum_{k=1}^n \mathsf{Cov}(X_j,X_k) \\ &= \sum_{j=1}^n \mathsf{Var}(X_j)+2\sum_{1\leq j< k\leq n}\mathsf{Cov}(X_j,X_k) \\ &\overset{\tiny\text{exchangeable}}{=} n\,\mathsf{Var}(X_1)+n(n-1)\,\mathsf{Cov}(X_1,X_2) \\ &\overset{\tiny\text{and independent}}{=} n\,\mathsf{Var}(X_1)\end{align*}
Note that the second-to-last step needs more than identical marginal distributions: all pairwise covariances must be equal, as under exchangeability. There are $\binom n2$ pairs with $j<k$, hence the factor $n(n-1)$ after multiplying by $2$.
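The double-sum identity is easy to verify by simulation; the correlated multivariate normal below is an arbitrary example, and the sum of all entries of its covariance matrix is exactly $\sum_{j,k}\mathsf{Cov}(X_j,X_k)$:

```python
import numpy as np

# Sanity check of Var(sum X_k) = sum_j Var(X_j) + 2 sum_{j<k} Cov(X_j, X_k)
# for fixed n = 3, using an arbitrary correlated multivariate normal.
rng = np.random.default_rng(2)
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])  # arbitrary covariance matrix
X = rng.multivariate_normal(np.zeros(3), Sigma, size=1_000_000)

mc = np.var(X.sum(axis=1))  # empirical variance of the sum
theory = Sigma.sum()        # sum over all (j, k) of Cov(X_j, X_k) = 6.5
print(mc, theory)
```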
When $N$ is a random variable and $(X_k)$ is a sequence of random variables.
\begin{align*}\mathsf{Var}\left(\sum_{k=1}^N X_k\right) = \mathsf E\left(\mathsf{Var}\left(\sum_{k=1}^N X_k\,\middle|\, N\right)\right)+\mathsf{Var}\left(\mathsf E\left(\sum_{k=1}^N X_k\,\middle|\, N\right)\right)\end{align*}
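In the common special case where the $X_k$ are i.i.d. and independent of $N$, this law-of-total-variance decomposition reduces to $\mathsf E(N)\,\mathsf{Var}(X_1)+\mathsf{Var}(N)\,\mathsf E(X_1)^2$. A simulation sketch (the $\mathrm{Poisson}(4)$ and $\mathrm{Exponential}(1)$ choices are arbitrary; it exploits the fact that a sum of $N$ i.i.d. $\mathrm{Exponential}(1)$ draws is $\mathrm{Gamma}(N,1)$):

```python
import numpy as np

# Sanity check of the law of total variance for a random number of i.i.d.
# summands independent of N.  In that special case it reduces to
#   Var(sum) = E(N) Var(X_1) + Var(N) E(X_1)^2.
# With N ~ Poisson(4) and X_k ~ Exponential(1) (arbitrary choices):
#   Var(sum) = 4 * 1 + 4 * 1^2 = 8.
rng = np.random.default_rng(3)
reps = 500_000
N = rng.poisson(4.0, size=reps)
# Sum of N i.i.d. Exponential(1) draws is Gamma(N, 1); N = 0 gives a sum of 0.
sums = np.where(N > 0, rng.gamma(np.maximum(N, 1), 1.0), 0.0)

mc = np.var(sums)
theory = 4.0 * 1.0 + 4.0 * 1.0**2
print(mc, theory)  # both close to 8
```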