Let $G$ be abelian with $n$ elements and let $G' = \{g_1 = e, \dots, g_k\} \subsetneq G$ be a (not necessarily minimal) set of generators. An element $g \in G$ is obtained by independently, uniformly at random (repetitions possible) selecting $m$ elements of $G'$ and multiplying them together. Prove there exists $b \in (0,1)$ such that $\lim\limits_{m \to \infty} \frac{1}{b^{2m}} \sum\limits_{x \in G} (\text{Pr}(g = x) - \frac{1}{n})^2$ is finite and non-zero.
Fedja remarked here that "Either you know the basic Fourier analysis on finite groups and then the problem is trivial (the convolution becomes just multiplication on the character group), or you don't and then you have almost no chance." I don't know basic Fourier analysis on finite groups, but I'm hoping to prove him wrong.
My attempt: Let $h_m$ be a random variable denoting the element obtained after multiplying $m$ uniformly chosen elements of $G'$. Then $\text{Pr}(h_m = x) = \frac{1}{k}\sum\limits_{i=1}^k \text{Pr}(h_{m-1} = xg_i^{-1}),$ so $$\left(\text{Pr}(h_m = x) - \frac{1}{n}\right)^2 = \frac{1}{k^2}\left(\sum\limits_{i=1}^k \left( \text{Pr}(h_{m-1} = xg_i^{-1}) - \frac{1}{n}\right)\right)^2 \le \frac{1}{k}\sum\limits_{i=1}^k \left( \text{Pr}(h_{m-1} = xg_i^{-1}) - \frac{1}{n}\right)^2$$ by Cauchy-Schwarz. Since the list of $nk$ elements $xg_i^{-1}$, $x \in G$, $1 \le i \le k$, contains every element of $G$ exactly $k$ times, summing over $x$ gives $$\sum\limits_{x \in G} \left(\text{Pr}(h_m = x) - \frac{1}{n}\right)^2 \le \sum\limits_{x \in G} \left(\text{Pr}(h_{m-1} = x) - \frac{1}{n}\right)^2.$$
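As a quick numerical sanity check of this contraction, here is a sketch on a small example of my own choosing (the cyclic group $\mathbb{Z}/6$ written additively, with generating set $G' = \{0, 1\}$ — these choices are illustrative, not part of the problem):

```python
import numpy as np

# Illustrative example (my choice, not from the problem): the cyclic
# group Z/6, written additively, with generating set G' = {0, 1}.
n = 6
gens = [0, 1]            # g_1 = e = 0 is included, as in the problem
k = len(gens)

def step(p):
    """One step: Pr(h_m = x) = (1/k) * sum_i Pr(h_{m-1} = x - g_i)."""
    q = np.zeros_like(p)
    for g in gens:
        q += np.roll(p, g)   # np.roll shifts so that q[x] += p[x - g]
    return q / k

uniform = np.full(n, 1.0 / n)
p = np.zeros(n)
p[0] = 1.0               # h_0 is the identity with probability 1
sums = []
for m in range(20):
    sums.append(np.sum((p - uniform) ** 2))
    p = step(p)
```

Each entry of `sums` should be at most the previous one, matching the displayed inequality.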
This proves that the limit without the $b^{-2m}$ factor exists, since the sum is monotone non-increasing and bounded below. However, that limit is certainly zero, which is why the factor is present in the first place. How do I obtain a finer estimate that lets me handle the full limit?
I'll post a slight variant of the Markov chain proof here for completeness.
Set up a Markov chain with state space $G$ and transition given by multiplying by the element $\frac1k\sum g_i$ of the group algebra $\mathbb{R}[G]$, i.e., $$ \mathbb{P}(X_{n+1}=gg_i\mid X_n=g)=\frac1k\quad\forall g\in G\quad\forall g_i\in G'. $$ The chain is irreducible since $G'$ generates $G$, and aperiodic since $g_1 = e \in G'$ gives every state a self-loop.
So the chain is ergodic and thus has a unique limiting distribution (the state space being finite), which is obviously $\frac1n$ on each $g$.
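To see the ergodic limit concretely, here is a small check on an assumed example ($\mathbb{Z}/6$ with $G' = \{0, 1\}$, additive notation) that the rows of $P^m$ approach the uniform distribution:

```python
import numpy as np

# Assumed example: the chain on Z/6 with G' = {0, 1} (additive notation).
n = 6
gens = [0, 1]
k = len(gens)

# Transition matrix: from state x, move to x + g_i with probability 1/k.
P = np.zeros((n, n))
for x in range(n):
    for g in gens:
        P[x, (x + g) % n] += 1.0 / k

# Ergodicity: every row of P^m converges to the uniform distribution 1/n.
Pm = np.linalg.matrix_power(P, 300)
uniform = np.full(n, 1.0 / n)
```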
We claim the eigenvectors of $\frac1k\sum g_i$ can be chosen orthogonal. To see this, note that the transpose matrix represents multiplication by $\frac1k\sum g_i^{-1}$ (this is vfiroiu's argument). Since $G$ is abelian, $\mathbb{R}[G]$ is a commutative ring, so the two matrices commute and the transition matrix is normal. The spectral theorem for normal matrices then gives an orthonormal basis of eigenvectors, and ergodicity tells us every eigenvalue other than $1$ has modulus $<1$. Now write our starting position $X_0=e$ as a linear combination of these eigenvectors and select $$ b=\max\{\lvert\lambda\rvert:\lambda\text{ is an eigenvalue},\ \lambda\neq 1,\ \operatorname{proj}_{\lambda}e\neq 0\}, $$ where $\operatorname{proj}_\lambda$ denotes projection onto the eigenspace of eigenvalue $\lambda$.
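The whole spectral picture can be checked numerically on a small abelian example (again $\mathbb{Z}/6$ with $G' = \{0, 1\}$, an assumed choice): the transition matrix is normal, and dividing the squared deviation from uniform by $b^{2m}$ gives a quantity that settles down to a finite nonzero value.

```python
import numpy as np

# Assumed example: Z/6 with G' = {0, 1}. Build the transition matrix.
n = 6
gens = [0, 1]
k = len(gens)

P = np.zeros((n, n))
for x in range(n):
    for g in gens:
        P[x, (x + g) % n] += 1.0 / k

# Normality: P commutes with its transpose (multiplication by inverses).
normal = np.allclose(P @ P.T, P.T @ P)

# b = largest modulus among eigenvalues other than 1.
b = max(a for a in np.abs(np.linalg.eigvals(P)) if a < 1 - 1e-9)

uniform = np.full(n, 1.0 / n)
p = np.zeros(n)
p[0] = 1.0                    # start at the identity
ratios = []
for m in range(1, 120):
    p = p @ P                 # one step of the chain
    ratios.append(np.sum((p - uniform) ** 2) / b ** (2 * m))
```

On this example the ratios converge to a positive constant, as the orthonormal eigenbasis argument predicts.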
Generalization to nonabelian $G$: There exists $b<1$ such that $$ \lim_{m\to\infty}\frac1{c^{2m}}\sum_{x\in G}\left[\mathbb{P}(g=x)-\frac1n\right]^2= \begin{cases} 0& c>b\\ \infty & 0<c<b. \end{cases} $$ The proof is similar to the abelian case. We still have ergodicity, but we lose the orthogonality of the eigenvectors and possibly diagonalizability, hence the inability to conclude what happens when $c=b$. However, the orthogonal complement of $\sum_{g\in G} g$ is still preserved, so we can project onto it and compute the spectral radius on the subrepresentations where the projection of $e$ is nonzero.