I have the following formula for variance
(1) $Var(X) = E((X - \mu)^2)$
(2) $Var(X) = E(X^2) - E(X)^2$
(3) $Var(X) = \sum(x_i-\mu_x)^2p_i$
I know how to get from (1) to (2), but if I'm given $n$ numbers and told to compute their variance, I can't use (1) or (2); I think I have to use (3). The most I've proved to myself is that $E(X)$ for a uniform distribution is the same as the "mean" or "average".
The first two formulas are the definition of variance: $$ Var(X) = E(X - EX)^2 = EX^2 - (EX)^2. $$ Now, if $X$ is a continuous random variable, then $$ Var(X) = \int (x-\mu_X)^2f_X(x)\,dx, $$ and if $X$ is discrete, then $$ Var(X) = \sum_{x} (x-\mu_X)^2P(X=x). $$
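As a concrete check of the discrete formula, here is a short sketch (my own example, not part of the question) computing the variance of a fair six-sided die, where $P(X=x)=1/6$ for $x=1,\dots,6$:

```python
from fractions import Fraction

# Var(X) = sum over x of (x - mu)^2 * P(X = x), for a fair die
outcomes = [Fraction(x) for x in range(1, 7)]
p = Fraction(1, 6)  # each face is equally likely

mu = sum(x * p for x in outcomes)               # E(X) = 7/2
var = sum((x - mu) ** 2 * p for x in outcomes)  # Var(X) = 35/12

print(mu, var)  # 7/2 35/12
```

Using `Fraction` keeps the arithmetic exact, so you can see that $E(X)=7/2$ and $Var(X)=35/12$ with no rounding.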
If you have $n$ numbers $x_1,\dots,x_n$, i.e., $n$ realizations of some random variable $X$, then the sample variance can be computed by $$ S^2 = \frac{1}{n-1} \sum_{i=1}^n (x_i - \bar{x}_n)^2. $$
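A minimal sketch of that computation, with made-up data just for illustration (the Python standard library's `statistics.variance` uses the same $n-1$ denominator, so it serves as a cross-check):

```python
import statistics

# Sample variance: S^2 = (1 / (n - 1)) * sum of (x_i - xbar)^2
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # illustrative data
n = len(data)
xbar = sum(data) / n                              # sample mean
s2 = sum((x - xbar) ** 2 for x in data) / (n - 1)

print(s2)
print(statistics.variance(data))  # stdlib agrees with the formula
```

Note the $n-1$ (rather than $n$) in the denominator: this is the unbiased sample variance, which is what $S^2$ above denotes.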