I know the formula for the expected value is $$E(X)=\sum_i f_i x_i$$ where $f_i$ denotes the PMF (probability mass function), i.e. $f_i = P(X = x_i)$, and the variance is $Var(X)=E((X-m)^2)$ where $m = E(X)$.
But what is the real significance of these calculations? How do they help us understand the probability distribution?
The best way to think of expected value is as "average value."
Imagine you had a coin with two faces: one with a 6, and the other with a 7. Suppose further that this coin is biased, meaning it doesn't land on the two sides with equal probability. Specifically, suppose it lands on the 7 side with probability $2/3$ and on the 6 side with probability $1/3$.
If you flipped this coin many times (say, 30,000 times), and wrote down the values, what would you get? There's no way of knowing, of course; it would depend on how the coin flips actually landed. But, you know that about $2/3$ of the coin flips would result in a 7, and about $1/3$ of the coin flips would land on a $6$. So, if you averaged all the coin flips together, the result should be close to $$\frac{20,000 \cdot 7 + 10,000 \cdot 6}{30,000} = \frac 2 3 \cdot 7 + \frac 1 3 \cdot 6$$ which is precisely the formula for the coin's expected value, and shows why we should think of it as an average in some sense.
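This averaging intuition is easy to check numerically. Here is a small sketch (assuming Python with only the standard library's `random` module) that simulates the biased coin 30,000 times and compares the sample average to the expected value $\frac 2 3 \cdot 7 + \frac 1 3 \cdot 6 = 20/3 \approx 6.667$:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def flip():
    # lands on 7 with probability 2/3, on 6 with probability 1/3
    return 7 if random.random() < 2/3 else 6

n = 30_000
flips = [flip() for _ in range(n)]
sample_mean = sum(flips) / n

expected = (2/3) * 7 + (1/3) * 6  # = 20/3, exactly the expected-value formula
print(f"sample mean: {sample_mean:.4f}, expected value: {expected:.4f}")
```

Each run gives a slightly different sample mean, but by the law of large numbers it will be close to $20/3$, which is the point of the average interpretation.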
Variance is a bit trickier, but the concept is similar: $(X - \mu)^2$ is the squared deviation from the mean, which is just what it sounds like: the gap from $X$ to its own mean, squared. The variance is the average squared deviation from the mean, in precisely the same sense of averaging as above. In short, variance measures how far a variable tends to spread out around its own mean.
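For the same biased coin, we can compute $E((X-\mu)^2)$ exactly from the definition and compare it to the average squared deviation in a simulation (again a Python sketch, standard library only):

```python
import random

random.seed(1)

# biased coin: 7 with probability 2/3, 6 with probability 1/3
values = [7, 6]
probs = [2/3, 1/3]

mu = sum(p * x for p, x in zip(probs, values))               # E(X) = 20/3
var = sum(p * (x - mu) ** 2 for p, x in zip(probs, values))  # E((X - mu)^2) = 2/9

# sample version: average squared deviation over many flips
flips = [7 if random.random() < 2/3 else 6 for _ in range(30_000)]
sample_mu = sum(flips) / len(flips)
sample_var = sum((x - sample_mu) ** 2 for x in flips) / len(flips)

print(f"exact:  mean={mu:.4f}, variance={var:.4f}")
print(f"sample: mean={sample_mu:.4f}, variance={sample_var:.4f}")
```

The exact variance works out to $\frac 2 3 \left(\frac 1 3\right)^2 + \frac 1 3 \left(\frac 2 3\right)^2 = 2/9$, and the simulated average squared deviation lands close to it.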
There are other ways you could measure a variable's spread around its own mean (such as the mean absolute deviation $\mathbb E \left| X - \mu \right|$), but the variance has important algebraic properties that make it a very convenient measure to work with.
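Two of those algebraic properties are $\mathrm{Var}(aX) = a^2\,\mathrm{Var}(X)$ and, for independent $X$ and $Y$, $\mathrm{Var}(X+Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$; the mean absolute deviation satisfies neither in general. A quick numerical sketch (Python, standard library; the helper name `sample_var` is my own) checking both identities for two independent copies of the biased coin:

```python
import random

random.seed(2)

def sample_var(xs):
    # average squared deviation from the sample mean
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

n = 100_000
# two independent biased coins, as in the example above
X = [7 if random.random() < 2/3 else 6 for _ in range(n)]
Y = [7 if random.random() < 2/3 else 6 for _ in range(n)]

vx, vy = sample_var(X), sample_var(Y)
v_sum = sample_var([x + y for x, y in zip(X, Y)])      # Var(X + Y)
v_scaled = sample_var([3 * x for x in X])              # Var(3X)

print(f"Var(X) + Var(Y) = {vx + vy:.4f}, Var(X+Y) = {v_sum:.4f}")
print(f"9 * Var(X) = {9 * vx:.4f}, Var(3X) = {v_scaled:.4f}")
```

The scaling identity holds exactly for the sample variance; the additivity identity holds only approximately in the sample (the discrepancy is twice the sample covariance, which shrinks toward zero as $n$ grows).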