We have $(X_i)_{i \in \mathbb Z}$ iid random variables with $1\le X_i \le2$ almost surely.
We define $X(x,\omega) \equiv X_i (\omega)$ if $x\in [i,i+1[$ and $X_\epsilon (x, \omega) \equiv X(x/\epsilon, \omega)$.
Define
$$Y (x,\omega) \equiv \frac 1 {E[\frac 1 {X_1}]} (1- \frac 1 {E[\frac 1 {X_1}] X_\epsilon (x,\omega)})$$
We would like to show that there exist real numbers $\alpha$ and $\beta$ such that
$$\epsilon^\alpha \int_0^1 Y(x, \omega)g(x)\,dx \to \mathcal N(0, \int_0^1\beta g^2(x) \,dx)$$ in distribution when $\epsilon \downarrow 0$ for any sufficiently smooth function $g$.
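Before attacking this analytically, here is a quick Monte Carlo sanity check. All specifics below are hypothetical choices for illustration: $X_i$ uniform on $\{1,2\}$ (so $E[1/X_1]=3/4$) and $g\equiv 1$. It suggests $\alpha=-1/2$ is the right scaling, since the variance of $\epsilon^{-1/2}\int_0^1 Y\,dx$ stabilizes as $\epsilon\downarrow 0$:

```python
import numpy as np

rng = np.random.default_rng(0)
M = 20000  # Monte Carlo replications

E_inv = 3.0 / 4.0  # E[1/X_1] for X_1 uniform on {1, 2}

def scaled_integral(n):
    """Sample epsilon^{-1/2} * int_0^1 Y(x, .) dx with epsilon = 1/n and g == 1."""
    X = rng.integers(1, 3, size=(M, n)).astype(float)  # X_0, ..., X_{n-1}
    Y = (1.0 / E_inv) * (1.0 - 1.0 / (E_inv * X))      # Y is constant on each cell
    return np.sqrt(n) * Y.mean(axis=1)                 # exact piecewise-constant integral

v100 = scaled_integral(100).var()
v400 = scaled_integral(400).var()
print(v100, v400)  # roughly equal: alpha = -1/2 keeps the variance O(1)
```

With $\alpha>-1/2$ the statistic would collapse to $0$, and with $\alpha<-1/2$ it would blow up; the stable variance is what makes a nondegenerate Gaussian limit plausible.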
Another very similar problem is the following; I don't know whether it is harder or easier.
EDIT: the following problem is harder; it has been moved to a separate question: How to use the Lindeberg CLT in this scenario? (analysis problem)
Define
$$u'_\epsilon(x,\omega)= \frac {c_\epsilon(\omega) - F(x)}{X_\epsilon(x,\omega)}$$
where $F$ is an $L^1([0,1])$ function and $c_\epsilon(\omega)$ is defined by
$$c_\epsilon(\omega)\equiv \frac{\int_0^1 \frac{F(y)}{X_\epsilon(y,\omega)} \, dy}{\int_0^1\frac 1 {X_\epsilon(y,\omega)}\, dy}$$
Show that $$\epsilon^\alpha\int_0^1 u'_\epsilon(x,\omega) g(x)\, dx \to \mathcal N(?, ?)$$
for a certain $\alpha\in\mathbb R$, in distribution when $\epsilon \downarrow 0$ for any sufficiently smooth function $g$, where we need to characterize the expectation and the variance.
A hint for this problem says that we should notice that $u'_\epsilon(x,\omega)$ can be written as the product of a random part and a deterministic part, plus an error that can be controlled in $L^2(]0,1[ \times \Omega)$. Indeed, expanding the definition,
$$u'_\epsilon(x,\omega)= -\frac {F(x)}{X_\epsilon(x,\omega)} + err_\epsilon(x,\omega)$$
where
$$err_\epsilon(x,\omega)=\frac {\int_0^1 \frac {F(y)}{X_\epsilon(y,\omega)} dy}{X_\epsilon(x,\omega)\int_0^1 \frac {1}{X_\epsilon(y,\omega)} dy}$$
I have no idea how to control this or how it will help. The question related to this control can be found here: Controlling this function in $L^2$ norm
Another post that brings more information about $u'_\epsilon$ is the following: Showing an $L^2$ convergence (with convergence rate)
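For what it's worth, the decomposition itself is exact pointwise algebra: $u'_\epsilon=(c_\epsilon-F)/X_\epsilon=-F/X_\epsilon+c_\epsilon/X_\epsilon$, and $c_\epsilon/X_\epsilon$ is precisely $err_\epsilon$ as displayed above. A quick numerical check, with hypothetical choices $F(x)=\sin(\pi x)$ and $X_i$ uniform on $[1,2]$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
eps = 1.0 / n
x = (np.arange(n) + 0.5) * eps      # cell midpoints (X_eps is constant per cell)

X = rng.uniform(1.0, 2.0, size=n)   # any law supported in [1, 2] works here
F = np.sin(np.pi * x)               # hypothetical F in L^1([0,1])

# c_eps = (int F/X_eps dy) / (int 1/X_eps dy), midpoint rule per cell
c = np.sum(F / X) / np.sum(1.0 / X)  # the factor eps cancels top and bottom

u_prime = (c - F) / X               # definition of u'_eps on each cell
err = c / X                         # = (int F/X_eps) / (X_eps(x) * int 1/X_eps)
gap = np.max(np.abs(u_prime - (-F / X + err)))
print(gap)  # zero up to rounding: u'_eps = -F/X_eps + err_eps
```

So the only analytic content of the hint is the $L^2$ control of $err_\epsilon$, not the identity itself.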
Treating first the stochastic part mentioned in the hint:
\begin{align} \epsilon^\alpha\int_0^1\frac{g(x)\,dx}{X_\epsilon(x,\omega)} &=\epsilon^\alpha\sum_{j=1}^{\lfloor 1/\epsilon\rfloor}\int_{(j-1)\epsilon}^{j\epsilon}\frac{g(x)\,dx}{X_\epsilon(x,\omega)} + \epsilon^\alpha\int_{\lfloor 1/\epsilon\rfloor\epsilon}^1\frac{g(x)\,dx}{X_\epsilon(x,\omega)}\\ &=\epsilon^\alpha\sum_{j=1}^{\lfloor 1/\epsilon\rfloor}\int_{(j-1)\epsilon}^{j\epsilon}\frac{g(x)\,dx}{X_{j-1}(\omega)} +\epsilon^\alpha\int_{\lfloor 1/\epsilon\rfloor\epsilon}^1\frac{g(x)\,dx}{X_\epsilon(x,\omega)}\\ &=\epsilon^\alpha\sum_{j=1}^{\lfloor 1/\epsilon\rfloor}\frac{1}{X_{j-1}(\omega)}\int_{(j-1)\epsilon}^{j\epsilon}g(x)\,dx + \epsilon^\alpha\int_{\lfloor 1/\epsilon\rfloor\epsilon}^1\frac{g(x)\,dx}{X_\epsilon(x,\omega)}\\ &=\epsilon^\alpha\sum_{j=1}^{\lfloor 1/\epsilon\rfloor} \left\{\frac{1}{X_{j-1}(\omega)}\int_{(j-1)\epsilon}^{j\epsilon}g(x)\,dx - E(1/X_1)\int_{(j-1)\epsilon}^{j\epsilon}g(x)\,dx\right\} + \epsilon^\alpha E(1/X_1)\int_0^1 g(x)\,dx + \epsilon^\alpha\int_{\lfloor1/\epsilon\rfloor\epsilon}^1 g(x)\left[\frac{1}{X_\epsilon(x,\omega)}-E(1/X_1)\right]dx. \end{align} In the second-to-last line, I want to apply a CLT to the sum. Because of the integrals, the terms change as terms are added to the sum (i.e., as $\epsilon\to 0$), so I need Lindeberg's CLT. The Lindeberg condition is met since the variables are bounded, but the terms need to be standardized.
In the last line I centered the summands; next, using independence of the $X_{j-1}$, the variance of the sum is \begin{align*} \newcommand{\Var}{\operatorname{Var}} \Var &\epsilon^\alpha\sum_{j=1}^{\lfloor 1/\epsilon\rfloor} \left\{\frac{1}{X_{j-1}(\omega)}\int_{(j-1)\epsilon}^{j\epsilon}g(x)\,dx - E(1/X_1)\int_{(j-1)\epsilon}^{j\epsilon}g(x)\,dx\right\}\\ &=\epsilon^{2\alpha}\sum_{j=1}^{\lfloor 1/\epsilon\rfloor}\left(\int_{(j-1)\epsilon}^{j\epsilon}g(x)\,dx\right)^2\Var(1/X_1)\\ &=\epsilon^{2\alpha}\sum_{j=1}^{\lfloor 1/\epsilon\rfloor}\left(g(x_j)\epsilon\right)^2\Var(1/X_1)\text{, where $x_j\in[(j-1)\epsilon,j\epsilon)$ by the mean value theorem}\\ &=\sum_{j=1}^{\lfloor 1/\epsilon\rfloor}g(x_j)^2\epsilon\Var(1/X_1) \text{, taking $\alpha=-1/2$} \\ &\to \Var(1/X_1)\int_0^1g(x)^2\,dx, \end{align*} as $\epsilon\to 0$.
In the first display, the last term of the last line, $\epsilon^\alpha\int_{\lfloor1/\epsilon\rfloor\epsilon}^1 g(x)\left[\frac{1}{X_\epsilon(x,\omega)}-E(1/X_1)\right]dx$, is negligible for this choice of $\alpha$, since $1/X_1\le 1$ gives $$ \left|\epsilon^\alpha\int_{\lfloor1/\epsilon\rfloor\epsilon}^1 g(x)\left[\frac{1}{X_\epsilon(x,\omega)}-E(1/X_1)\right]dx\right| \le \epsilon^\alpha\|g\|_\infty\cdot 2\cdot(1-\lfloor1/\epsilon\rfloor\epsilon) < 2\epsilon^{1/2}\|g\|_\infty\to 0. $$ Let $\beta=\Var(1/X_1)$. Putting things together and applying Slutsky's lemma, $$ \epsilon^\alpha\int_0^1\frac{g(x)\,dx}{X_\epsilon(x,\omega)} - \epsilon^\alpha E(1/X_1)\int_0^1 g(x)\,dx \leadsto\mathcal{N}\left(0,\beta\int_0^1g(x)^2\,dx\right). $$ The term on the lhs is \begin{align} \epsilon^\alpha &\int_0^1\frac{g(x)\,dx}{X_\epsilon(x,\omega)} - \epsilon^\alpha E(1/X_1)\int_0^1 g(x)\,dx\\ &=\epsilon^\alpha\int_0^1\left(\frac{1}{X_\epsilon(x,\omega)}-E(1/X_1)\right)g(x)\,dx\\ &=\epsilon^\alpha\int_0^1(-E(1/X_1))\left(1-\frac{1}{E(1/X_1)X_\epsilon(x,\omega)}\right)g(x)\,dx, \end{align}
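To double-check the limit just obtained, a short simulation (again with the hypothetical choice $X_i$ uniform on $\{1,2\}$, so $\operatorname{Var}(1/X_1)=1/16$, and this time a nonconstant $g(x)=x$, so $\int_0^1 g^2=1/3$) reproduces the predicted variance $\operatorname{Var}(1/X_1)\int_0^1 g^2\,dx$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, M = 200, 20000                 # 1/epsilon and Monte Carlo replications
eps = 1.0 / n

X = rng.integers(1, 3, size=(M, n)).astype(float)  # X_i uniform on {1, 2}
E_inv, Var_inv = 3.0 / 4.0, 1.0 / 16.0             # E[1/X_1], Var(1/X_1)

# g(x) = x: exact per-cell weights int_{(j-1)eps}^{j eps} x dx
j = np.arange(1, n + 1)
w = ((j * eps) ** 2 - ((j - 1) * eps) ** 2) / 2.0

S = eps ** (-0.5) * ((1.0 / X - E_inv) @ w)  # eps^{-1/2} int (1/X_eps - E) g dx
pred = Var_inv / 3.0                         # Var(1/X_1) * int_0^1 x^2 dx
print(S.mean(), S.var(), pred)               # mean near 0, variance near pred
```

The per-cell weights here play the role of the $\int_{(j-1)\epsilon}^{j\epsilon}g(x)\,dx$ factors in the variance computation above.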
leading me to wonder whether there is a typo: $$ Y'(x,\omega) = (-E(1/X_1))\left(1-\frac{1}{E(1/X_1)X_\epsilon(x,\omega)}\right)? $$ In any event, $Y'$ differs from $Y$ by the constant factor $-E(1/X_1)^2$, i.e. $Y' = -E(1/X_1)^2\,Y$, so you can obtain your result from this one by adjusting $\beta$: for $Y$ itself, take $\alpha=-1/2$ and $\beta=\Var(1/X_1)/E(1/X_1)^4$.
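Since $Y=-Y'/E(1/X_1)^2$, the limiting variance for $Y$ picks up a factor $E(1/X_1)^{-4}$. A final numerical check of this adjusted constant, under the same hypothetical setup ($X_i$ uniform on $\{1,2\}$, $g\equiv 1$): the empirical variance of $\epsilon^{-1/2}\int_0^1 Y\,dx$ should be close to $\operatorname{Var}(1/X_1)/E(1/X_1)^4 = (1/16)\cdot(4/3)^4 = 16/81$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, M = 400, 20000
eps = 1.0 / n

X = rng.integers(1, 3, size=(M, n)).astype(float)  # X_i uniform on {1, 2}
E_inv = 3.0 / 4.0                                  # E[1/X_1]

Y = (1.0 / E_inv) * (1.0 - 1.0 / (E_inv * X))      # Y, constant on each cell
S = eps ** (-0.5) * Y.mean(axis=1)                 # eps^{-1/2} int_0^1 Y dx, g == 1

beta_adj = (1.0 / 16.0) / E_inv ** 4               # Var(1/X_1) / E[1/X_1]^4 = 16/81
print(S.var(), beta_adj)                           # should be close
```

This is consistent with the claim that only $\beta$, not $\alpha$, changes between the $Y'$ and $Y$ formulations.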