I just started learning applied probability. What do the mean and variance actually signify? If we map the events of getting a head or a tail on a coin toss to $X = 0$ and $X = 1$ respectively, the mean (or expected value) is $0(\frac{1}{2}) + 1(\frac{1}{2}) = \frac{1}{2}$, following the definition $E[X] = \sum_{i}x_ip_{X}(x_i)$.
If instead we map the events of getting a head or a tail to $X = 1$ and $X = 2$, the mean becomes $1(\frac{1}{2}) + 2(\frac{1}{2}) = 1.5$.
There is nothing in between a head and a tail in this experiment. Moreover, the mean and variance change depending on how the random variable is defined. (If the mean varies, presumably the variance varies too.) So what do the mean and variance actually signify, especially when $X$ is discrete and each value of $X$ is mapped to a mutually exclusive event?
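The two mean computations above can be checked numerically. Here is a quick sketch; the `mean` helper is my own, not from any library:

```python
def mean(values, probs):
    """Expected value E[X] = sum_i x_i * p(x_i)."""
    return sum(x * p for x, p in zip(values, probs))

probs = [0.5, 0.5]          # fair coin: P(head) = P(tail) = 1/2
print(mean([0, 1], probs))  # 0.5 -> mapping head=0, tail=1
print(mean([1, 2], probs))  # 1.5 -> mapping head=1, tail=2
```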
The Wikipedia-style definitions are: the mean is the expected value and the variance measures how far the set of possible outcomes is spread out.
I find nothing wrong with the above; it is both intuitive and correct. Describing the outcome heads as $0$ and tails as $1$ gives $$E[X]=\frac{1}{2},$$ which can be interpreted as: the expected value is equally far from both $0$ (heads) and $1$ (tails), i.e. both events are equally likely. Translating everything along the real line to $Y\in\{1,2\}$ gives $$E[Y]=\frac{3}{2},$$ and you can draw the same conclusion.
As for the variances, note that they do not change under this translation, i.e. $\mathrm{Var}(X)=\mathrm{Var}(Y)$ (or more generally, $\mathrm{Var}(X+c)=\mathrm{Var}(X)$ for any constant $c$).
Namely: $$\mathrm{Var}(X)=E[X^2]-E[X]^2=\frac{1}{2}-\left(\frac{1}{2}\right)^2=\frac{1}{4}$$ and $$\mathrm{Var}(Y)=E[Y^2]-E[Y]^2=\frac{5}{2}-\left(\frac{3}{2}\right)^2=\frac{1}{4},$$ which is again intuitively correct. You simply assign values ($0$ and $1$, or $1$ and $2$) to the outcomes heads and tails; the spread of the outcomes should not depend on this assignment.
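The translation invariance of the variance can be verified numerically as well. Again a sketch, with helper names of my own choosing:

```python
def mean(values, probs):
    """Expected value E[X] = sum_i x_i * p(x_i)."""
    return sum(x * p for x, p in zip(values, probs))

def variance(values, probs):
    """Var(X) = E[X^2] - E[X]^2."""
    return mean([x * x for x in values], probs) - mean(values, probs) ** 2

probs = [0.5, 0.5]               # fair coin
print(variance([0, 1], probs))   # 0.25 for X in {0, 1}
print(variance([1, 2], probs))   # 0.25 for Y in {1, 2}: same spread
```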
Hope this helps :)