I have read both my textbook and the Wikipedia page, but there is a nuance in the notation that I am lost on.
The variance is defined as:
$E((X-\mu)^2)$
Assume we are in the finite (discrete) case.
Does that mean that the expansion is:
$\sum (x_i-\mu)^2p(x_i)$
Or:
$\sum (x_ip(x_i)-\mu)^2$
You want the average squared deviation. First compute the mean $\mu$ by the usual formula:
$$ \mu = \sum_{x \in \Omega} xp(x) $$
if $X$ is a discrete random variable, or
$$ \mu = \int_{\Omega} x \ dF(x) = \int_{\Omega} x f(x) \ dx $$
where $f$ is the density and $F$ is the CDF.
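To make the discrete formula concrete, here is a minimal sketch in Python. It assumes a fair six-sided die as the distribution (an illustrative choice, not something from the question), and uses `Fraction` so the arithmetic is exact:

```python
from fractions import Fraction

# Illustrative distribution: a fair six-sided die,
# so Omega = {1, ..., 6} and p(x) = 1/6 for each outcome.
outcomes = [1, 2, 3, 4, 5, 6]
p = {x: Fraction(1, 6) for x in outcomes}

# mu = sum over Omega of x * p(x)
mu = sum(x * p[x] for x in outcomes)
print(mu)  # 7/2
```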
Once you have this, compute the deviations $x - \mu$, square them all, and take their weighted average. That is, you'll compute
$$ \sigma^2 = \sum_{x \in \Omega} (x - \mu)^2p(x) $$
which gives the variance. (A similar formula, with the sum replaced by an integral, is available for continuous random variables.) I challenge you to use this to derive the shortcut formula $\sigma^2 = E(X^2) - \mu^2$, where $E(X^2)$ is the average of the squares!
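Continuing the fair-die sketch from above (again, the die is just an illustrative assumption), this snippet computes the variance with the correct expansion from the question, verifies the shortcut formula numerically, and shows that the second candidate expansion gives a different (wrong) number:

```python
from fractions import Fraction

# Illustrative distribution: a fair six-sided die.
outcomes = [1, 2, 3, 4, 5, 6]
p = {x: Fraction(1, 6) for x in outcomes}

mu = sum(x * p[x] for x in outcomes)  # E(X) = 7/2

# Correct expansion: sum of (x_i - mu)^2 * p(x_i)
var = sum((x - mu) ** 2 * p[x] for x in outcomes)
print(var)  # 35/12

# Shortcut formula: E(X^2) - mu^2 agrees exactly
e_x2 = sum(x ** 2 * p[x] for x in outcomes)
assert var == e_x2 - mu ** 2

# Second candidate from the question: sum of (x_i p(x_i) - mu)^2.
# This mixes the weighting into the deviation and is NOT the variance.
wrong = sum((x * p[x] - mu) ** 2 for x in outcomes)
print(wrong)  # a different value, not 35/12
```

The assertion passing illustrates the identity $\sigma^2 = E(X^2) - \mu^2$ for this particular distribution; the derivation itself is the challenge above.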