The Central Limit Theorem is well known in statistics. Roughly, it states:
Given a large number of independent random variables with the same distribution
and finite second moment, the (suitably normalized) mean of these variables tends
to a normal distribution.
If we remove one of these constraints, namely that the variables have the same distribution: what is the "simplest" (in some suitable sense) sequence of random distributions for which we can prove the sum will not tend to a normal distribution?
A sufficient condition for the CLT to hold is that the variables (besides having finite variance) are independent and identically distributed. It can also hold for variables that are not identically distributed, but then some additional conditions on the moments (e.g. Lindeberg's or Lyapunov's condition) are needed.
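To illustrate the first part, here is a minimal simulation sketch (the mix of distributions is my own choice for illustration): a standardized sum of independent but *not* identically distributed variables, alternating uniforms and coin flips. This mix satisfies the Lindeberg condition, so the standardized sum should still look approximately standard normal, which we check via the fraction of samples within one standard deviation.

```python
import random

random.seed(1)

def standardized_sum(n=100):
    """Sum n independent, non-identically distributed variables,
    then standardize to mean 0 and variance 1."""
    total, var = 0.0, 0.0
    for i in range(n):
        if i % 2 == 0:
            total += random.uniform(-1, 1)   # Var = 1/3
            var += 1 / 3
        else:
            total += random.choice([-1, 1])  # Var = 1
            var += 1.0
    return total / var ** 0.5

samples = [standardized_sum() for _ in range(10_000)]

# For a standard normal, P(|Z| <= 1) is about 0.6827
frac = sum(abs(x) <= 1 for x in samples) / len(samples)
print(frac)
```

The observed fraction lands close to 0.6827, consistent with (though of course not a proof of) approximate normality.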
A typical example in which the conditions are not fulfilled is an autoregressive process: let $x_n= a x_{n-1} + e_n$ where $|a|<1$ and $e_n$ is iid (white noise) following some distribution with (say) zero mean and finite variance $\sigma^2_e$.
Then, solving that iteratively in terms of $e_n$ we get
$$\begin{align}x_n&=e_n + a e_{n-1} + a^2 e_{n-2} + \cdots\\ &= u_0 + u_{1}+u_{2}+ \cdots \end{align}$$
where $u_{k}=a^k e_{n-k}$ are independent random variables, but not identically distributed (in particular, $\operatorname{Var}(u_k)= a^{2k} \sigma^2_e$).
That $x_n$ does not follow a normal distribution in general (it is of course normal if $e_n$ is normal) can be seen by taking a bounded $e_n$, for example $e_n \sim U[-1,1]$. In this case $|x_n|\le 1 + |a| + |a|^2 + \cdots = (1-|a|)^{-1}$, hence $x_n$ is also bounded, so it cannot be normal.
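The boundedness argument can be checked numerically. The sketch below (truncation length and sample size are arbitrary choices) draws many samples of the stationary AR(1) sum $x_n = \sum_k a^k e_{n-k}$ with $e \sim U[-1,1]$ and $a = 0.5$, and verifies that every sample stays inside the bound $(1-a)^{-1} = 2$; a normal variable would exceed any fixed bound with positive probability.

```python
import random

random.seed(0)
a = 0.5
n_terms = 200        # truncate the infinite sum; a**200 is negligible
n_samples = 10_000

def ar1_sample():
    # x = sum over k of a**k * e_k, with e ~ U[-1, 1]
    return sum((a ** k) * random.uniform(-1, 1) for k in range(n_terms))

samples = [ar1_sample() for _ in range(n_samples)]
bound = 1 / (1 - a)  # = 2 for a = 0.5

# All samples lie strictly inside the theoretical bound
print(max(abs(x) for x in samples) <= bound)
```

With 10,000 draws, not a single sample escapes the interval $[-2, 2]$, in line with the argument above.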