If $f(x), g(x) \in L^2(\mathbb{R})$ and
$\lim\limits_{h\to 0}\int_{\mathbb{R}}|f_h(x)-g(x)|^2\,dx=0$,
where $f_h(x):=\frac{f(x+h)-f(x)}{h}$ for any $h\neq 0$,
show that $f(x)=\int_{[0,x]}g(t)\,dt+C$ for some constant $C$.
Since
$$
\lim_{h\to0}\int_0^x\left|\frac{f(t+h)-f(t)}{h}-g(t)\right|^2\,\mathrm{d}t=0\tag{1}
$$
Hölder's Inequality (with exponents $p=q=2$, using that $[0,x]$ has finite measure) gives
$$
\lim_{h\to0}\int_0^x\left|\frac{f(t+h)-f(t)}{h}-g(t)\right|\,\mathrm{d}t=0\tag{2}
$$

There is no way to derive the continuity of $f$ from $(1)$ alone, since changing $f$ on a set of measure $0$ does not affect $(1)$. However, the Lebesgue Differentiation Theorem says that
$$
\bar{f}(x)=\lim_{h\to0}\frac1h\int_x^{x+h}f(t)\,\mathrm{d}t\tag{3}
$$
exists and equals $f(x)$ almost everywhere.

Suppose that $\bar{f}\!(a)$ exists. Then
$$
\begin{align}
\int_0^xg(t)\,\mathrm{d}t
&=\int_0^ag(t)\,\mathrm{d}t+\int_a^xg(t)\,\mathrm{d}t\\
&=\int_0^ag(t)\,\mathrm{d}t+\lim_{h\to0}\int_a^x\frac{f(t+h)-f(t)}{h}\,\mathrm{d}t\\
&=\int_0^ag(t)\,\mathrm{d}t
+\lim_{h\to0}\frac1h\int_x^{x+h}f(t)\,\mathrm{d}t
-\lim_{h\to0}\frac1h\int_a^{a+h}f(t)\,\mathrm{d}t\\
&=\int_0^ag(t)\,\mathrm{d}t+\bar{f}\!(x)-\bar{f}\!(a)\tag{4}
\end{align}
$$
where the second equality uses $(2)$ and the third uses the change of variables
$$
\int_a^x\frac{f(t+h)-f(t)}{h}\,\mathrm{d}t=\frac1h\int_x^{x+h}f(t)\,\mathrm{d}t-\frac1h\int_a^{a+h}f(t)\,\mathrm{d}t
$$

Equation $(4)$ implies that if $\bar{f}\!(a)$ exists, then $\bar{f}\!(x)$ exists for all $x$, and
$$
\bar{f}\!(x)=\int_0^xg(t)\,\mathrm{d}t+\underbrace{\bar{f}\!(a)-\int_0^ag(t)\,\mathrm{d}t}_C\tag{5}
$$
Equation $(5)$ implies that $\bar{f}$ is absolutely continuous. Since $\bar{f}=f$ almost everywhere by $(3)$, equation $(5)$ gives $f(x)=\int_0^xg(t)\,\mathrm{d}t+C$ for almost every $x$, as required.
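As a numerical sanity check (not part of the proof), we can test the conclusion on a concrete example: take $f(x)=e^{-x^2}$ and $g=f'$, both in $L^2(\mathbb{R})$, and verify on a grid that $f(x)=\int_0^x g(t)\,\mathrm{d}t+C$ with $C=f(0)$. The grid, interval, and tolerance below are illustrative choices.

```python
import numpy as np

# Concrete example: f(x) = exp(-x^2), g = f' = -2x exp(-x^2).
# Both lie in L^2(R), and C = f(0) should recover f from the integral of g.
x = np.linspace(0.0, 3.0, 10001)
f = np.exp(-x**2)
g = -2.0 * x * np.exp(-x**2)

# Cumulative trapezoidal integral of g from 0 to each grid point x.
dx = x[1] - x[0]
integral = np.concatenate(([0.0], np.cumsum((g[1:] + g[:-1]) * dx / 2.0)))

C = f[0]  # the constant from (5); here f(0) = 1
err = np.max(np.abs(f - (integral + C)))
print(err)  # small: only trapezoidal discretization error remains
```

The discrepancy `err` is on the order of the $O(\mathrm{d}x^2)$ trapezoidal error, consistent with $f$ being an antiderivative of $g$ up to the constant $C$.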