We have independent, identically distributed random variables $X_i$ that take values only on certain integers, with the following probabilities: $$p(1) = 70/128$$ $$p(9) = 42/128$$ $$p(25) = 14/128$$ $$p(49) = 2/128$$ The probability of any other integer is $0$.
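As a sanity check on the setup (a Python sketch of my own, not the questioner's code), we can verify that these probabilities sum to 1 and compute the exact mean and variance, which turn out to be pleasantly round numbers:

```python
from fractions import Fraction

# The distribution from the question: odd squares with the stated probabilities.
pmf = {
    1:  Fraction(70, 128),
    9:  Fraction(42, 128),
    25: Fraction(14, 128),
    49: Fraction(2, 128),
}

assert sum(pmf.values()) == 1  # a valid probability distribution

mean = sum(x * p for x, p in pmf.items())
var = sum(x**2 * p for x, p in pmf.items()) - mean**2
print(mean, var)  # 7 84
```

So $E(X) = 7$ and $\operatorname{Var}(X) = 84$ exactly.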
Numerical simulations show that the sample average $${\displaystyle {\bar {X}}_{n}\equiv {\frac {X_{1}+\cdots +X_{n}}{n}}}$$ does NOT appear to follow the central limit theorem (there is no sign of convergence to a normal distribution). Why?
The CLT guarantees that $\bar{X}_n$, suitably standardized, converges to a normal distribution as $n \rightarrow \infty$, but it makes no guarantees about how quickly that convergence happens. Here's what $X_1$ looks like as a line plot:
Here's what $\sum_{i = 1}^n X_i$ looks like when $n = 2$:
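To see exactly why the $n = 2$ plot is still so spiky, we can compute the distribution of the sum exactly by convolving the pmf with itself (again a Python sketch of my own, not the R code at the end of this answer):

```python
from fractions import Fraction
from collections import defaultdict

pmf = {1: Fraction(70, 128), 9: Fraction(42, 128),
       25: Fraction(14, 128), 49: Fraction(2, 128)}

def convolve(p, q):
    """Exact pmf of the sum of two independent variables with pmfs p and q."""
    out = defaultdict(Fraction)
    for x, px in p.items():
        for y, qy in q.items():
            out[x + y] += px * qy
    return dict(out)

# pmf of X_1 + X_2: the sum can only land on a handful of values,
# which is why the n = 2 plot still looks nothing like a bell curve.
pmf2 = convolve(pmf, pmf)
print(sorted(pmf2))  # [2, 10, 18, 26, 34, 50, 58, 74, 98]
```

Iterating `convolve` gives the exact distribution for any $n$, which is how you can reproduce the $n = 8$ and $n = 256$ plots below without sampling.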
It's shifted a tiny bit towards being more symmetrical, but I will admit that it doesn't look very bell-curve-like yet. If we go to $n = 8$, it looks something like this:
Now there is a clear bell curve appearing. It's still spiky, because there are plenty of values it can't take, but the behaviour matches what the CLT predicts.
If we take it up to $n = 256$, then we have something that is indisputably bell-shaped, although at this scale you can actually see banding of two different bell curves hiding in there:
If we were then to scale this, by taking $\frac{\sum X_i - n\,E(X)}{\sqrt{n \operatorname{Var}(X)}}$, and plot it as a histogram, you'd get something very close to a standard normal distribution.
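That standardization step can be checked directly by simulation. The sketch below (my own Python, not the R code that follows) draws repeated sums of $n = 256$ i.i.d. copies, standardizes them using $E(X) = 7$ and $\operatorname{Var}(X) = 84$, and confirms the resulting values have roughly zero mean and unit variance, as a standard normal should:

```python
import random

values = [1, 9, 25, 49]
weights = [70, 42, 14, 2]   # numerators of the probabilities over 128
n, reps = 256, 5000
mean, var = 7, 84           # exact E(X) and Var(X) for this distribution

random.seed(0)
z = [(sum(random.choices(values, weights, k=n)) - n * mean) / (n * var) ** 0.5
     for _ in range(reps)]

# Sample moments of the standardized sums: should be near 0 and 1.
print(sum(z) / reps)
print(sum(v * v for v in z) / reps)
```

A Q-Q plot or histogram of `z` against `dnorm` (in R) or `scipy.stats.norm` would show the near-normal shape directly.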
Since I've updated the images to fix an error in how I calculated the probabilities, here's the hacked-together R code I used so that others can confirm that I didn't make any other mistakes: