Consider a sum:
$$ I_N(x) = \sum^N_{i=0} h_i(x) \eta_i $$
where
$$ h_i(x) < \infty \quad \forall\, i, $$
and the $\eta_i$ are independent, identically distributed Gaussian random variables. Is there a central limit theorem for the case
$$ \lim_{N\rightarrow \infty} I_N(x) = \sum^\infty_{i=0} h_i(x) \eta_i $$
i.e. is there a proof that $\lim_{N\rightarrow \infty} I_N(x)$ is Gaussian for all $x$?
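For finite $N$ the claim is elementary, so the question is really about the limit. Writing out the finite-$N$ case (assuming, for concreteness, a common distribution $\eta_i \sim \mathcal{N}(\mu, \sigma^2)$; the question does not fix the mean and variance):

$$ I_N(x) = \sum^N_{i=0} h_i(x)\,\eta_i \sim \mathcal{N}\!\left(\mu \sum^N_{i=0} h_i(x),\; \sigma^2 \sum^N_{i=0} h_i(x)^2\right), $$

since any finite linear combination of independent Gaussians is Gaussian.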
UPDATE: In answer to Arnaud Mégret's comment, $x \in X$ is just a deterministic value, a way of stating that $h_i(x) \neq \text{Constant}$; as an example, the $h_i$ could be Legendre polynomials. Of course, for any given $x = a = \text{Constant}$, each $h_i(a)$ is a constant, and therefore by the weighted Gaussian sum we know that $\lim_{N\rightarrow \infty} I_N(a)$ is indeed Gaussian. This would naturally extend to all $x$.

However, I was trying to find a more rigorous proof of this (as it's a tad hand-wavy) within the context of the central limit theorem(s). One way would be proof by induction, i.e. if it's Gaussian for $x = a_n < \infty$ then it must also be Gaussian for $x = a_{n+1} < \infty$, hence proving it for all $a_n \in X$. I'm sure this must have been done somewhere in a book or journal, and I am looking for such a reference and the name of that particular central limit theorem (since there appears to be more than one).
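As a quick numerical sanity check of the finite-$N$ weighted-sum fact mentioned above (a sketch only; the Legendre weights, the evaluation point $x = 0.3$, the truncation $N = 6$, and $\eta_i \sim \mathcal{N}(0,1)$ are all illustrative assumptions, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights h_i(x): Legendre polynomials P_0(x)..P_5(x) evaluated
# at a fixed deterministic point x, as suggested in the update above.
x = 0.3
N = 6
h = np.polynomial.legendre.legvander(np.array([x]), N - 1)[0]

# Draw many realizations of I_N(x) = sum_i h_i(x) * eta_i, eta_i ~ N(0, 1).
M = 200_000
eta = rng.standard_normal((M, N))
I = eta @ h

# For iid standard Gaussians, I_N(x) ~ N(0, sum_i h_i(x)^2), so the sample
# variance should be close to the analytic value.
print("sample variance:", I.var())
print("analytic variance:", (h ** 2).sum())
```

This only checks the first two moments of the finite sum, of course; it says nothing about the $N \to \infty$ limit, which is what the question asks about.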