Consider a discrete distribution like this one: $[0.6,0.15,0.1,0.08,0.05,0.02]$
Its entropy (in bits, using base-2 logs) is $-\sum_i p_i\log_2 p_i \approx 1.805$, and the variance of its probability values is $\frac{1}{n}\sum_i(p_i - \bar{p})^2 \approx 0.0392$.
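A quick check of these two numbers (a sketch in Python; base-2 logs for the entropy are an assumption about the intended units):

```python
import math

p = [0.6, 0.15, 0.1, 0.08, 0.05, 0.02]

# Shannon entropy in bits: -sum_i p_i * log2(p_i)
entropy = -sum(pi * math.log2(pi) for pi in p)

# Population variance of the probability values themselves
mean_p = sum(p) / len(p)
variance = sum((pi - mean_p) ** 2 for pi in p) / len(p)

print(round(entropy, 3))   # 1.805
print(round(variance, 6))  # 0.039189
```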
Both measure how spread out the distribution is. For a distribution like this one, which is far from uniform, what information does one capture that the other does not?
Variance is sensitive to the scale of the random variable's values, while entropy is not. Entropy depends only on the probability masses, not on the values the variable takes, so if $X$ is a random variable with finite support, then $X$ and $100X$ have the same entropy but different variances ($\operatorname{Var}(100X) = 100^2\operatorname{Var}(X)$).
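To illustrate the scale argument concretely (a sketch; the support values $1,\dots,6$ are a hypothetical choice, since the question only specifies the probabilities):

```python
import math

p = [0.6, 0.15, 0.1, 0.08, 0.05, 0.02]
x = [1, 2, 3, 4, 5, 6]   # hypothetical support; not given in the question

def entropy_bits(probs):
    # Entropy uses only the probabilities, never the support values.
    return -sum(pi * math.log2(pi) for pi in probs)

def variance(values, probs):
    # Var(X) = E[(X - E[X])^2], which does use the support values.
    mean = sum(v * pi for v, pi in zip(values, probs))
    return sum(pi * (v - mean) ** 2 for v, pi in zip(values, probs))

x_scaled = [100 * v for v in x]

# Entropy is unchanged under rescaling of the support:
print(entropy_bits(p))                        # same for X and 100X
# Variance scales by the square of the factor:
print(variance(x, p), variance(x_scaled, p))  # second is 100^2 times the first
```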