Consider a random undirected graph on $n$ nodes, with no multi-edges or self-loops, in which each possible edge exists independently with probability $p$. Let the random variable $X$ be the number of isolated nodes.
I found $E[X] = n(1−p)^{n-1}$
I know that $Var[X] = E[X^2] - E[X]^2$.
Would I be correct in saying $Var[X] = n - (n(1−p)^{n-1})^2$, based on the proposition that for a random variable $S_n$ defined as $S_{n} = X_1 + X_2 + \dots + X_n$, we have $E[S_n^2] = n$? I'm setting $X_i$ to be an indicator variable that is $1$ if the $i$-th node is isolated and $0$ otherwise, so $X = X_1 + X_2 + \dots + X_n$ is essentially this $S_n$.
Or is this invalid?
If $(X_i)$ is the sequence of indicator variables for isolation, then $S_n=\sum_{i=1}^n X_i$ counts the isolated nodes.
Now the expectation $\mathsf E(X_i)$ is the probability that node $i$ has no edge to any of the $n-1$ remaining nodes, which is $(1-p)^{n-1}$.
Since each $X_i$ has a Bernoulli distribution, the expectation of ${X_i}^2$ equals the expectation of $X_i$.$$\begin{align}\mathsf{Var}(X_i)&=\mathsf E({X_i}^2)-\mathsf E(X_i)^2\\[1ex]&=(1-p)^{n-1}\,(1-(1-p)^{n-1})\end{align}$$
So your expectation is indeed correct:
$$\begin{align}\mathsf E(S_n) &=\sum_{i=1}^n\mathsf E(X_i)&&\text{linearity of expectation}\\[1ex]&= n\,\mathsf E(X_1)&&\text{identical distributions}\\[1ex]&=n\,(1-p)^{n-1}&&\text{.}\end{align}$$
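As a quick sanity check of $\mathsf E(S_n)=n(1-p)^{n-1}$, here is a small Monte Carlo simulation (not part of the proof; the values of $n$, $p$, and the trial count are arbitrary illustrative choices):

```python
import random

def count_isolated(n, p, rng):
    # Sample each of the n*(n-1)/2 possible edges independently with
    # probability p and track the degree of every node.
    degree = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                degree[i] += 1
                degree[j] += 1
    return sum(1 for d in degree if d == 0)

rng = random.Random(0)
n, p, trials = 10, 0.2, 20000
estimate = sum(count_isolated(n, p, rng) for _ in range(trials)) / trials
exact = n * (1 - p) ** (n - 1)
print(estimate, exact)  # the two values should agree closely
```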
However, for the expectation of the square of the sum, things are not linear: the square of a sum is not the sum of squares, since $(a+b)^2=a^2+2ab+b^2$.
$$\begin{align}\mathsf E\left({S_n}^2\right)&=\mathsf E\left(\sum_{i=1}^nX_i\cdot\sum_{j=1}^n X_j\right)\\&=\sum_{i=1}^n \mathsf E\left({X_i}^2\right)+2\sum_{1\leqslant i<j\leqslant n}\,\mathsf E(X_iX_j)\\[1ex] &= n\mathsf E({X_1}^2)+ n(n-1)\,\mathsf E(X_1X_2)\\[1ex]&= n~(1-p)^{n-1}+ n(n-1)\,\mathsf E(X_1X_2)\end{align}$$
That leaves you to evaluate $\mathsf E(X_1X_2)$, the probability that two particular nodes are both isolated. Be careful: the two events are not independent, since they share the potential edge between the two nodes.
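If you want to check your closed-form answer for $\mathsf E(X_1X_2)$ numerically, a short simulation works (again a sketch, with $n$, $p$, and the trial count chosen arbitrarily; nodes $0$ and $1$ stand in for the two fixed nodes):

```python
import random

def both_isolated(n, p, rng):
    # Node 0 is isolated iff all n-1 of its incident edges are absent.
    # Node 1 then additionally needs its n-2 remaining incident edges
    # absent: the edge {0, 1} is shared, so it is sampled only once.
    for _ in range(n - 1):   # edges incident to node 0
        if rng.random() < p:
            return 0
    for _ in range(n - 2):   # edges incident to node 1, minus {0, 1}
        if rng.random() < p:
            return 0
    return 1

rng = random.Random(1)
n, p, trials = 8, 0.15, 200000
estimate = sum(both_isolated(n, p, rng) for _ in range(trials)) / trials
print(estimate)  # compare with your closed-form value of E(X_1 X_2)
```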