My book has two examples of computing $E(X^2)$
- Let X be the score on a fair die
$E(X^2) = \frac{1}{6}(1^2 + 2^2+3^2+4^2+5^2+6^2)$
- Let X be the number of fixed points in a permutation
$E(X^2)=\sum_i^nE(X_i)^2 + \sum_{i\neq j}E(X_iX_j)$
I understand that the second one comes from the fact that $X = X_1+X_2+\dots+X_n$, so $X^2 = (X_1+X_2+\dots+X_n)(X_1+X_2+\dots+X_n)$, which is where the two summations come from.
The part I don't understand is how the two methods relate. Can we express the score on a fair die in the form of the second definition, or is the second definition reserved only for indicator r.v.'s?
In your first case $X$ is not a sum, just a random variable.
In the first case, you would have $E(X^2)=\frac{1+4+9+16+25+36}{6}=\frac{91}{6}$ and also $E(X)=\frac{1+2+3+4+5+6}{6}=\frac72$
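A quick sanity check of these two values (not part of the original answer; the variable names are mine), using exact fractions so no floating-point error creeps in:

```python
from fractions import Fraction

# E(X) and E(X^2) for a fair six-sided die, each face with probability 1/6
faces = range(1, 7)
EX = sum(Fraction(x, 6) for x in faces)
EX2 = sum(Fraction(x * x, 6) for x in faces)

print(EX)   # 7/2
print(EX2)  # 91/6
```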
In your second case you should probably have written $E(X^2)=\sum_{i=1}^nE(X_i^2) + \sum_{i\neq j}E(X_iX_j)$: each $X_i$ is squared inside the expectation, not after it.
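To see the corrected identity at work on the fixed-point example, here is a brute-force check (my own sketch, assuming Python; `n = 5` is an arbitrary small case). For indicators, $E(X_i^2)=E(X_i)=\frac1n$ and $E(X_iX_j)=\frac{(n-2)!}{n!}=\frac{1}{n(n-1)}$ for $i\neq j$:

```python
from fractions import Fraction
from itertools import permutations

n = 5
perms = list(permutations(range(n)))
N = len(perms)  # n! equally likely permutations

# Direct: E(X^2), where X = number of fixed points of a uniform permutation
EX2_direct = sum(Fraction(sum(p[i] == i for i in range(n)) ** 2, N) for p in perms)

# Via the identity: sum_i E(X_i^2) + sum_{i != j} E(X_i X_j),
# with E(X_i^2) = E(X_i) = 1/n and E(X_i X_j) = 1/(n(n-1))
EX2_identity = n * Fraction(1, n) + n * (n - 1) * Fraction(1, n * (n - 1))

print(EX2_direct, EX2_identity)  # both equal 2
```

Both routes give $E(X^2)=1+1=2$, the well-known second moment of the number of fixed points.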
While your two statements are not directly related, you could use them together, for example to find the second moment of the sum $S$ of $n$ independent fair dice:
$$E(S^2)=\frac{91}{6}n+ \left(\frac72\right)^2n(n-1) = \frac{35}{12}n +\frac{49}{4}n^2,$$
which, together with $E(S)=\frac{7n}{2}$, gives the variance of the sum as $E(S^2)-E(S)^2=\frac{35}{12}n$.
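The closed form above can be checked by enumerating all outcomes for a small number of dice (again my own sketch, with `n = 3` chosen arbitrarily):

```python
from fractions import Fraction
from itertools import product

n = 3
rolls = list(product(range(1, 7), repeat=n))
N = len(rolls)  # 6**n equally likely outcomes

# Direct second moment of S = X_1 + ... + X_n
ES2_direct = sum(Fraction(sum(r) ** 2, N) for r in rolls)

# Closed form: 35n/12 + 49n^2/4
ES2_formula = Fraction(35 * n, 12) + Fraction(49 * n * n, 4)

print(ES2_direct, ES2_formula)  # both equal 119 for n = 3
```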