I have a formula for the variance which is of the form
$\mathrm{Var}(X) = \frac{1}{n} \sum_{k=-n}^{n} \left(1-\frac{|k|}{n}\right) f(k)$
The only assumptions I make are:

1. $f(x) \to 0$ as $x \to \infty$
2. $f(x) = f(-x)$
The second assumption lets me fold the negative indices into the positive ones, since the $k$ and $-k$ terms are equal:
$\mathrm{Var}(X) = \frac{1}{n}\left[f(0) + 2\sum_{k=1}^{n} \left(1-\frac{k}{n}\right) f(k)\right] = \frac{2}{n}\sum_{k=0}^{n} \left(1-\frac{k}{n}\right) f(k) - \frac{f(0)}{n}$
Though I'm not sure this helps at all.
I wish to show that as $n \to \infty$ the variance goes to zero, i.e. that this sum tends to zero. I have little experience with proving convergence, and I would like to understand how to prove this properly rather than give a hand-wavy answer about why it must go to zero.
It is like a Cesàro mean (which you may consult on Wikipedia).
Since $f(k)\to 0$, the sequence is bounded: $|f(k)|\leq M$ for all $k$. Let $\epsilon>0$. There is $n_0$ so that $|f(k)|<\epsilon$ for $|k|> n_0$. Then for $n>n_0$, splitting the sum into the $2n_0+1$ terms with $|k|\leq n_0$ (each at most $M$) and the at most $2(n-n_0)$ terms with $|k|>n_0$ (each below $\epsilon$): $$ |{\rm Var}_n(X)| \leq \frac{1}{n} \left((2 n_0+1) M + 2 (n-n_0)\epsilon\right)$$ Letting $n\rightarrow \infty$ we see that $\limsup_n |{\rm Var}_n(X)| \leq 2\epsilon$, and since $\epsilon>0$ was arbitrary, ${\rm Var}_n(X)\to 0$.
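As a quick numerical sanity check of this argument, here is a short Python sketch. The choice $f(k) = 0.9^{|k|}$ is just a hypothetical example (it is even and tends to zero, so it satisfies both assumptions); the weighted sum $\frac{1}{n}\sum_{k=-n}^{n}(1-|k|/n)f(k)$ should visibly shrink as $n$ grows.

```python
def f(k):
    # Example choice only: even in k and decays to zero, as assumed.
    return 0.9 ** abs(k)

def var_n(n):
    # The triangular-weighted average from the question:
    # (1/n) * sum_{k=-n}^{n} (1 - |k|/n) f(k)
    return sum((1 - abs(k) / n) * f(k) for k in range(-n, n + 1)) / n

for n in (10, 100, 1000, 10000):
    print(n, var_n(n))
```

The printed values decrease roughly like $C/n$ (here $\sum_k f(k)$ is finite, so the decay is even faster than the proof requires), which matches the $\limsup \leq 2\epsilon$ bound above.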