I'm having trouble showing the following: if a probability density function $f$ is differentiable on $\mathbb{R}$, its first derivative $f'$ is absolutely continuous on $\mathbb{R}$, and its second derivative satisfies $\int (f''(x))^2 \, dx < \infty$, then $f$ must be bounded on $\mathbb{R}$.
I'm also not sure why the following argument doesn't work, given that there exist continuous density functions that are not in $L_2$:
Since $\int f(x) \, dx = 1$, it must be that $f$ is bounded almost everywhere, and hence, when $f$ is continuous, bounded everywhere, say by some $B > 0$. Then $\|f\|_{2}^2 = \int (f(x))^2 \, dx \leq B \int f(x) \, dx = B$, so that $f \in L_2(\mathbb{R})$ whenever it is assumed continuous.
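In case it helps pin down where this goes wrong, here is the kind of continuous density I have in mind (a standard spike construction; the particular heights and widths are just one choice that works). Let $f$ be piecewise linear, equal to $0$ outside a triangular spike centered at each integer $n \geq 1$ of height $2^{n+1}$ and base width $4^{-n}$. Each spike has area
$$\frac{1}{2} \cdot 2^{n+1} \cdot 4^{-n} = 2^{-n},$$
so $\int f(x) \, dx = \sum_{n \geq 1} 2^{-n} = 1$ and $f$ is a continuous (but unbounded) density. However, a triangle of height $h$ and base $w$ contributes $\int f^2 = h^2 w / 3$, which here is
$$\frac{(2^{n+1})^2 \cdot 4^{-n}}{3} = \frac{4}{3}$$
for every $n$, so $\|f\|_2^2 = \infty$ and $f \notin L_2(\mathbb{R})$.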
Any help would be much appreciated.